# Homework Help: Dimension and basis for subspace determined by given vectors

1. Jan 23, 2010

### Combinatus

1. The problem statement, all variables and given/known data

Assume that $$e_1 ,..., e_n$$ is a basis for the vector space V. Let W be the linear subspace spanned by the vectors $$e_{1}-e_{2}, e_{2}-e_{3}, ..., e_{n-1}-e_{n}, e_{n}-e_{1}$$. Determine the dimension of W, and a basis for W.

2. Relevant equations

3. The attempt at a solution

After trying two separate (and somewhat lengthy) approaches, both yielded that the dimension of W is n, and that $$e_{1}-e_{2}, e_{2}-e_{3}, ..., e_{n-1}-e_{n}, e_{n}-e_{1}$$ form a basis for W, i.e. no manipulation is needed since the given vectors should already be linearly independent.

The answer key, however, states that the dimension should be n-1, with basis $$e_{1}-e_{2}, e_{2}-e_{3}, ..., e_{n-1}-e_{n}$$.

After considering the key applied to a 3D vector space with the basis $$e_1, e_2, e_3$$, the key makes sense, since $$e_3-e_1 = -(e_1-e_2)-(e_2-e_3)$$ lies in the plane spanned by $$e_1-e_2$$ and $$e_2-e_3$$. I'm not certain how I should apply this knowledge to n-dimensional space.

2. Jan 23, 2010

### Combinatus

I figured it out. Basically, I showed that the vectors $$e_1-e_2, e_2-e_3, ... , e_n-e_1$$ are linearly dependent in any dimension (their sum telescopes to the zero vector), and that after excluding one of them, the remaining set can be shown to be linearly independent, giving dimension n-1. It was a reasonably entertaining problem (which wasn't too hard after all), so if you want to give it a shot, or try a different approach, go for it!
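A quick numerical sanity check of this argument (my addition, not part of the thread): using the standard basis of R^n as a stand-in for $$e_1, ..., e_n$$, the cyclic differences should have rank n-1, their sum should be the zero vector, and dropping the last one should leave an independent set.

```python
# Sanity check of the cyclic-difference argument, using the standard
# basis of R^n in place of an abstract basis e_1, ..., e_n.
import numpy as np

def cyclic_differences(n):
    """Rows are e_i - e_{i+1} (indices mod n), i.e. the n given vectors."""
    E = np.eye(n)
    # np.roll shifts the rows up by one, so row i of the result is
    # e_i - e_{i+1 mod n}; the last row is e_n - e_1.
    return E - np.roll(E, -1, axis=0)

for n in range(2, 8):
    D = cyclic_differences(n)
    # All n vectors together span only an (n-1)-dimensional subspace...
    assert np.linalg.matrix_rank(D) == n - 1
    # ...because their sum telescopes to the zero vector...
    assert np.allclose(D.sum(axis=0), 0)
    # ...while the first n-1 of them are linearly independent.
    assert np.linalg.matrix_rank(D[:-1]) == n - 1
print("dim W = n - 1 confirmed for n = 2..7")
```

The `np.roll` construction is just a compact way to write the cyclic shift; building the matrix row by row works equally well.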

3. Jan 24, 2010

### HallsofIvy

Another way to do this is to note that the matrix having the retained vectors as columns has all "1"s along the main diagonal and "-1" just below the main diagonal. Then it's easy to show that this matrix has full column rank (its top $$(n-1)\times(n-1)$$ block is triangular with determinant 1), so the vectors are independent. But your method is perfectly good.
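This matrix argument can be sketched numerically as well (my addition, not from the thread): write the n-1 vectors $$e_1-e_2, ..., e_{n-1}-e_n$$ as the columns of an n×(n-1) matrix in the standard basis and check its rank.

```python
# Sketch of the matrix argument: the j-th column holds the coordinates
# of e_j - e_{j+1}, giving 1's on the main diagonal and -1's just below.
import numpy as np

def difference_matrix(n):
    """n x (n-1) matrix whose j-th column is e_j - e_{j+1} (standard basis)."""
    A = np.zeros((n, n - 1))
    for j in range(n - 1):
        A[j, j] = 1.0       # "1" on the main diagonal
        A[j + 1, j] = -1.0  # "-1" just below the main diagonal
    return A

n = 6
A = difference_matrix(n)
# The top (n-1)x(n-1) block is lower triangular with 1's on the diagonal,
# so its determinant is 1 and the columns are linearly independent.
assert np.isclose(np.linalg.det(A[:n - 1, :]), 1.0)
assert np.linalg.matrix_rank(A) == n - 1
```

Since the square top block is triangular, its determinant is the product of the diagonal entries, which is exactly the "easy to show" step in the post above.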