# Equivalence relation in complementary subspace

1. Sep 11, 2008

### guhan

I was revising linear algebra and came across the topic of constructing a complementary subspace to a given subspace. Since the proof of its existence (which used Zorn's lemma) was not constructive, the author instead defines an equivalence relation to construct a subspace U of the vector space V complementary to the subspace W. The equivalence relation, $$\equiv_W$$, is defined by:
(u $$\equiv_W$$ v) iff (u - v) $$\in$$ W
I understand this algebraically, but I am not able to 'visualize' it correctly. Can someone please clarify this with an example? Say, let V = R^4 and W = {(x1, x2, 0, 0) | xi $$\in$$ R}.
How does the space of V get partitioned?
How does a partition of V help in the construction of U?
Is U unique? I can construct only one U for the above V and W. Am I missing something?

2. Sep 11, 2008

### HallsofIvy

Staff Emeritus
Two vectors u and v will be "equivalent" in this sense if and only if their difference is in W: in this example, if and only if their last two coordinates are the same. For example, (1, 3, 2, 5) and (7, -4, 2, 5) are equivalent. Thus, each equivalence class can be identified with the last two coordinates: every vector in the equivalence class containing both (1, 3, 2, 5) and (7, -4, 2, 5) can be identified with (0, 0, 2, 5) (which is, of course, also in that equivalence class). V is "partitioned" in that sense (it is not a true partition) into W = {(a, b, 0, 0)} and its complementary subspace {(0, 0, c, d)}.

(This is not a true "partition" because, of course, there exist vectors in R^4, such as (1, 1, 1, 1), that are not in either W or its complement but are linear combinations of such vectors.)
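The equivalence relation in this R^4 example can be sketched numerically. Below is a minimal Python check (the helper names `in_W`, `equivalent`, and `representative` are mine, chosen for illustration, not from the book):

```python
import numpy as np

def in_W(x):
    # W = {(a, b, 0, 0)}: membership means the last two coordinates vanish
    return x[2] == 0 and x[3] == 0

def equivalent(u, v):
    # u ≡_W v  iff  u - v ∈ W, i.e. u and v share their last two coordinates
    return in_W(np.array(u) - np.array(v))

def representative(u):
    # canonical representative of the class [u]: zero out the first two coordinates
    return np.array([0, 0, u[2], u[3]])

u = (1, 3, 2, 5)
v = (7, -4, 2, 5)
assert equivalent(u, v)                            # same equivalence class
assert tuple(representative(u)) == (0, 0, 2, 5)    # the class is tagged by (0, 0, 2, 5)
```

The first two coordinates never affect `equivalent`, which is exactly why each class is labelled by its last two coordinates alone.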

3. Sep 11, 2008

### guhan

Correct me if I am wrong...
The equivalence class (in your example) containing (0,0,2,5) also contains (1,3,2,5), (7,-4,2,5), etc. How does the above equivalence relation 'pick out' or 'favour' (0,0,2,5) over the others in its class in the construction of the complementary space U? For W = {(a,b,0,0)}, the other vectors in the equivalence class of (0,0,2,5) do not belong to the complementary space, right? (Otherwise the intersection of W and U would be more than just the zero vector.)

Also, isn't this equivalence relation a complete partition of V? After all, the (1,1,1,1) in your counterexample does belong to the equivalence class containing (0,0,1,1).
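The claim that (1,1,1,1) sits in the class of (0,0,1,1) is a one-line arithmetic check, assuming W = {(a, b, 0, 0)} as above:

```python
import numpy as np

# Membership of the difference in W = {(a, b, 0, 0)} just means
# its last two coordinates vanish.
d = np.array([1, 1, 1, 1]) - np.array([0, 0, 1, 1])
assert d[2] == 0 and d[3] == 0   # d = (1, 1, 0, 0) lies in W
```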

4. Sep 11, 2008

### guhan

The problem I am facing is that I am not able to 'see' how this partitioning of the vector space V into equivalence classes by that relation helps in constructing the subspace U (the complement of the subspace W). The reason I am looking for such a construction is that the author starts this topic with the following...
"The proof (that for any given subspace there always exists complementary subspace) has a distinctly non-constructive feel to it, which is typical of proofs invoking Zorn's lemma. A more direct way to arrive at the vector space complementary to a given subspace W is to define an equivalence relation, $$\equiv$$ w, on V..."
(this is a mathematical physics book)

5. Sep 11, 2008

### HallsofIvy

Staff Emeritus
It doesn't. Any member of that class could represent the whole class. However, the whole class differs from any other class only in its last two components: they are 2 and 5 for every member of this class. The first two components are irrelevant.

The last part of that is incorrect. W is a subspace, and U is a subspace built from the equivalence classes in a particular way.

What "counter example" are you talking about?

6. Sep 12, 2008

### morphism

Technically speaking the equivalence classes you obtain aren't a vector subspace of V, but they form a vector space which is isomorphic to one. (For details, look up quotient vector spaces. What you're dealing with is the quotient vector space V/W; it's isomorphic to a complementary subspace of W.)

This is just a construction. Complementary subspaces are almost never unique. For instance, take the horizontal x-axis in the real plane. This subspace is complemented in the plane by any other line through the origin.
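The non-uniqueness in the plane example can be verified directly: a line U through the origin complements the x-axis W precisely when span{w, u} is all of R^2 (equivalently, the 2×2 matrix [w | u] has full rank, which also forces W ∩ U = {0}). A quick numpy sketch:

```python
import numpy as np

# W = the x-axis in R^2, spanned by (1, 0).
w = np.array([1.0, 0.0])

# Two different candidate complements: the y-axis and the line y = x.
for u in (np.array([0.0, 1.0]), np.array([1.0, 1.0])):
    M = np.column_stack([w, u])
    # Full rank means span{w, u} = R^2 and span{w} ∩ span{u} = {0},
    # so each of these lines is a valid complement of W.
    assert np.linalg.matrix_rank(M) == 2
```

Any line through the origin other than the x-axis itself passes this test, which is exactly the statement that complements are almost never unique.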

7. Sep 12, 2008

### guhan

Okay! These cleared most of my problem.

If V is a vector space and W is a subspace...
Now, can I go further and say that the set consisting of one element from each equivalence class (and just the zero vector from the class containing W) is a complementary subspace of the subspace W?
In the x-axis example in the quote above, that would mean any curve through the origin meeting each equivalence class exactly once, e.g. a strictly increasing/decreasing curve, and not just the straight lines.

8. Sep 13, 2008

### morphism

No, that set is not a complementary subspace of W - it need not even be a subspace!

The idea is that if you take W and collapse it, you will get a new vector space consisting of the equivalence classes you defined in post #1 (we define addition and scalar multiplication in the obvious way: [u]+[v]=[u+v] and k[v]=[kv]; it's easy to check that these definitions are good (i.e. addition and scaling are 'well-defined'), and turn the set of equivalence classes into a vector space).

It's like when you take the integers Z and consider the subset of multiples of some n, i.e. the set nZ. When you look at the equivalence classes in this setting, you're basically looking at the integers modulo n. This set, which we denote by Z/nZ, naturally carries structure similar to Z (it's a ring). Analogously, we're taking W in V and forming the vector space V/W.

Now this vector space has a basis*, say {[v_i]}. Note that [v_i] is then necessarily nonzero, i.e. v_i does not belong to W, for each i. I claim that the set {v_i} is linearly independent in V. Can you prove this claim? Having done this, consider the space U spanned by {v_i}. It's clear that W and U intersect only in {0} (why?), and that moreover W+U=V.

To see why the latter equation holds, take any v in V, and consider its equivalence class [v]. We can find scalars $\lambda_1, \ldots, \lambda_n$ and vectors $v_{i_1}, \ldots, v_{i_n}$ such that

$$[v]=\lambda_1 [v_{i_1}] + \cdots + \lambda_n [v_{i_n}] = [\lambda_1 v_{i_1} + \cdots + \lambda_n v_{i_n}]$$.

In other words, $v - (\lambda_1 v_{i_1} + \cdots + \lambda_n v_{i_n}) \in W$, so that there is a w in W such that $v = w + (\lambda_1 v_{i_1} + \cdots + \lambda_n v_{i_n})$. But by definition, $\lambda_1 v_{i_1} + \cdots + \lambda_n v_{i_n} \in U$. This proves that v is in W+U.

*: Well, to actually prove that every vector space has a basis we need Zorn's lemma (to ensure that even infinite-dimensional vector spaces have bases). Since you're trying to avoid Zorn's lemma, we can stick to those instances when V/W is finite dimensional, i.e. when W is a 'sufficiently large' subspace of V.
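In the finite-dimensional R^4 example from earlier in the thread, this whole construction can be carried out concretely. Below is a sketch assuming W = {(a, b, 0, 0)} and one convenient (not unique) choice of coset representatives, v_1 = e3 and v_2 = e4:

```python
import numpy as np

# V = R^4, W = {(a, b, 0, 0)} = span{w1, w2}.
# [e3] and [e4] form a basis of V/W; choose representatives v1 = e3, v2 = e4.
v1 = np.array([0.0, 0.0, 1.0, 0.0])
v2 = np.array([0.0, 0.0, 0.0, 1.0])
w1 = np.array([1.0, 0.0, 0.0, 0.0])
w2 = np.array([0.0, 1.0, 0.0, 0.0])

# Rank 4 means {w1, w2, v1, v2} is a basis of V, i.e. W + U = V
# and W ∩ U = {0} for U = span{v1, v2}.
assert np.linalg.matrix_rank(np.column_stack([w1, w2, v1, v2])) == 4

# Decompose an arbitrary v as v = w + u with w in W, u in U:
v = np.array([1.0, 1.0, 1.0, 1.0])
u = v[2] * v1 + v[3] * v2        # the scalars λ are read off from the class [v]
w = v - u
assert w[2] == 0 and w[3] == 0   # w = (1, 1, 0, 0) lies in W
```

Choosing different representatives (say v_1 = e3 + e1) would produce a different but equally valid complement U, matching the non-uniqueness discussed above.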

9. Oct 21, 2008

### guhan

Oh yeah, that set need not even form a (sub)space!
Thanks. Your explanation did clear up my confusion!
And I think the linear independence of { v_i } in V can be proven from the linear independence of { [v_i] } in V/W. Also, using the same linear independence, we can show that the intersection of W and U is {0} alone.