Can a Basis for a Subspace Always Include Specific Elements of a Larger Basis?

  • Thread starter: ehsan_thr
  • Tags: Basis, Subspace
ehsan_thr
Suppose that ##\{e_i\}_{1\le i\le n}## is a basis for ##V## and ##W## is a subspace of ##V## of dimension ##m<n##. Can we always find a basis for ##W## that includes ##m## elements of ##\{e_i\}_{1\le i\le n}##?

Does ##W## always contain ##m## elements of ##\{e_i\}_{1\le i\le n}##?
 
What does it mean for a list of vectors to be a basis for a vector space ##V##?
 
It means every vector of ##V## is a linear combination of elements of the basis. (The elements of the basis must also be linearly independent.)
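For reference, here are both conditions written out in one line (just the standard definition restated in symbols):
$$\operatorname{span}(e_1,\dots,e_n)=V \qquad\text{and}\qquad \sum_{i=1}^{n} c_i e_i = 0 \;\Longrightarrow\; c_1=\dots=c_n=0 .$$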
I have tried to prove it, but I haven't managed to so far. My ultimate goal is to prove that for every proper subspace ##W## of a finite-dimensional inner product space ##V## there is a nonzero vector ##x## that is orthogonal to ##W##.
 
ehsan_thr said:
Suppose that ##\{e_i\}_{1\le i\le n}## is a basis for ##V## and ##W## is a subspace of ##V## of dimension ##m<n##. Can we always find a basis for ##W## that includes ##m## elements of ##\{e_i\}_{1\le i\le n}##?

Does ##W## always contain ##m## elements of ##\{e_i\}_{1\le i\le n}##?

No. For example, let ##V## be the space of linear polynomials and let ##\{e_i\} = \{x-1,\ x+1\}##. Let ##W## be the set of all multiples of ##x##. Then ##W## is a one-dimensional subspace of ##V##, and any basis for ##W## must be ##\{ax\}## for some nonzero number ##a##. None of those is in ##\{e_i\}##.

The other direction is true: given any basis for a subspace ##W##, we can extend it to a basis for ##V##.
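To spell the counterexample out a little (same notation as above): every element of ##W## has the form ##ax##, while neither ##x-1## nor ##x+1## is a multiple of ##x##, so no basis of ##W## can contain an element of ##\{e_i\}##. For the extension direction, one possible extension here is ##\{x\}\subset\{x,\ x+1\}##, which is a basis of ##V## since
$$x-1 = 2\cdot x - (x+1), \qquad x+1 = 0\cdot x + (x+1),$$
so ##\{x,\ x+1\}## spans ##V## and is linearly independent.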
 

Thank you so much, I will find another approach to my problem.
 
Here's how I might approach the problem:

Start with an orthogonal basis of ##W##, and pick any vector ##v## in ##V## that's not in ##W##. Using your basis for ##W##, can you use ##v## to find a vector that's orthogonal to ##W##?
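One way to flesh out that hint (a sketch of the standard projection argument, not necessarily the only route): if ##w_1,\dots,w_m## is an orthogonal basis of ##W## and ##v\in V## with ##v\notin W##, set
$$x = v - \sum_{i=1}^{m} \frac{\langle v, w_i\rangle}{\langle w_i, w_i\rangle}\, w_i .$$
Then for each ##j##, orthogonality of the ##w_i## gives ##\langle x, w_j\rangle = \langle v, w_j\rangle - \frac{\langle v, w_j\rangle}{\langle w_j, w_j\rangle}\langle w_j, w_j\rangle = 0##, so ##x## is orthogonal to all of ##W##; and ##x\neq 0##, since ##x=0## would put ##v## in the span of the ##w_i##, i.e. in ##W##.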
 