Can a subspace be written as the direct sum of two orthogonal spaces?

  • Thread starter matness
  • #1
matness
A thm says:

if W is a subspace of V, then V = direct sum of W and CW (the orthogonal complement of W),
i.e. for all v ∈ V there exist w ∈ W and w' ∈ CW s.t. v = w + w'.

Does this mean that we can write a function as a sum of two orthogonal functions?

Also, I don't know the proof of this theorem. Can you also post a simple sketch of the proof?
Thanks..
 
  • #2
What functions are you talking about? If you are thinking of a vector space of functions - the set of functions continuous on an interval, or the set of all polynomials, for example - then what that says is that if we have a subspace of such functions, we can find the orthogonal complement subspace (again, that depends on how you have defined the inner product: [tex](f,g)= \int_a^b f(t)g(t)dt[/tex], for example).

Yes, in that situation, any function can be written as the sum of two functions orthogonal to one another. Given such a subspace and its orthogonal complement, you can find a basis for each that, together, forms a basis for the whole vector space. Given any function f in the vector space, write it as a linear combination of the basis functions. Then let F be the linear combination taking only those basis functions in the first subspace, and G the linear combination taking only those basis functions in the orthogonal complement: f = F + G, and they are orthogonal.
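As a concrete sketch of this projection (my own minimal example, not from the thread): take V to be polynomials on [0,1] with the integral inner product above, and W the one-dimensional subspace of constant functions. Exact rational arithmetic makes the orthogonality check exact.

```python
from fractions import Fraction

def inner(p, q):
    """(p, q) = integral of p(t) q(t) over [0,1] for polynomials given as
    coefficient lists [a0, a1, ...] meaning a0 + a1*t + ..."""
    total = Fraction(0)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            total += Fraction(a) * Fraction(b) / (i + j + 1)  # ∫ t^(i+j) dt = 1/(i+j+1)
    return total

one = [1]            # basis of W = constant functions
f = [0, 1]           # f(t) = t

# orthogonal projection of f onto W, then the complementary piece
c = inner(f, one) / inner(one, one)
F = [c]                        # F(t) = 1/2, lies in W
G = [f[0] - c] + f[1:]         # G = f - F, lies in the orthocomplement of W

print(F, G, inner(F, G))       # (F, G) = 0, and f = F + G
```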
 
  • #3
I'm not sure I understand your answer, Halls. Take the space of continuous functions on the interval [0,1] with the inner product you gave, by integration. Then take the subspace of functions vanishing at 0.

I claim the only function orthogonal to this subspace is the zero function. So it is not true that a basis of this subspace and a basis of its orthocomplement yield a basis of the whole space. I.e. there seems to be a problem with non-closed subspaces.

What am I misunderstanding?

Of course in finite dimensions there is no problem.
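A quick numeric sketch of why this claim holds (my own illustration, assuming the integral inner product from post #2): if g is orthogonal to every f with f(0) = 0, then in particular g is orthogonal to t·g(t), so the integral of t·g(t)² vanishes, forcing g = 0 on (0,1] and hence everywhere by continuity. Checking the candidate g = 1 with a midpoint rule:

```python
# W = {f continuous on [0,1] : f(0) = 0}. Test whether g = 1 could lie
# in the orthocomplement of W by pairing it with w(t) = t*g(t), which
# vanishes at 0 and therefore lies in W.

def integral(h, n=10000):
    # simple midpoint rule on [0,1]
    return sum(h((k + 0.5) / n) for k in range(n)) / n

g = lambda t: 1.0          # candidate for the orthocomplement
w = lambda t: t * g(t)     # w(0) = 0, so w lies in W

ip = integral(lambda t: w(t) * g(t))   # = ∫ t dt = 1/2
print(ip)                  # not 0, so g = 1 is not orthogonal to W
```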
 
  • #4
Well, if the Hilbert space is separable, then it admits a countable subset everywhere dense in the Hilbert space, which can be structured as an orthonormal basis.
In case the Hilbert space admits an orthonormal basis, it's easy to see that it admits a decomposition as a direct sum of mutually orthogonal subspaces.

This is accomplished by introducing orthogonal projectors.

Let the unit operator on the Hilbert space [tex] \hat{1}_{\mathcal{H}} [/tex] admit the following decomposition

[tex] \hat{1}_{\mathcal{H}}=\left[\frac{1}{2}\left(\hat{1}_{\mathcal{H}}+\hat{U}\right)\right]+\left[\frac{1}{2}\left(\hat{1}_{\mathcal{H}}-\hat{U}\right) \right] [/tex] (1)

, where I require that the densely defined linear operator [itex] \hat{U}[/itex] be
*bounded
*self-adjoint
*unitary.

Then it's easy to see that both operators on the RHS (the ones in square brackets) are
*bounded
*self-adjoint
*idempotent

, therefore they are orthogonal projectors. Moreover, they are mutually orthogonal.

So the separable Hilbert space admits a decomposition as a direct sum of two subspaces orthogonal to each other.

Daniel.
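A finite-dimensional sketch of this construction (my own toy example; any self-adjoint unitary U works, here the 2x2 swap matrix): form P± = (1 ± U)/2 and verify that they are idempotent and mutually orthogonal, hence orthogonal projectors.

```python
from fractions import Fraction as Fr

def matmul(A, B):
    n = len(A)
    return [[sum(A[i][k] * B[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

U = [[Fr(0), Fr(1)], [Fr(1), Fr(0)]]   # self-adjoint (U = U^T) and unitary (U^2 = 1)
I = [[Fr(1), Fr(0)], [Fr(0), Fr(1)]]

P_plus  = [[(I[i][j] + U[i][j]) / 2 for j in range(2)] for i in range(2)]
P_minus = [[(I[i][j] - U[i][j]) / 2 for j in range(2)] for i in range(2)]

print(matmul(P_plus, P_plus) == P_plus)     # idempotent
print(matmul(P_minus, P_minus) == P_minus)  # idempotent
print(matmul(P_plus, P_minus))              # the zero matrix: mutually orthogonal
```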
 
  • #5
In Quantum Mechanics, the previous construction appears in the case of multiparticle systems built from 2 identical particles.

That [itex] \hat{U} [/itex] is the permutation operator; the operator with the "+" in the square brackets is called the "symmetrizer" and the one with the "-" the "antisymmetrizer".

Daniel.
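As a small sketch (my own illustration, with functions of two variables standing in for two-particle states): the permutation operator acts by (U f)(x, y) = f(y, x), giving the symmetrizer S = (1 + U)/2 and antisymmetrizer A = (1 - U)/2, with f = Sf + Af.

```python
def S(f):
    """Symmetrizer: (S f)(x, y) = (f(x, y) + f(y, x)) / 2."""
    return lambda x, y: (f(x, y) + f(y, x)) / 2

def A(f):
    """Antisymmetrizer: (A f)(x, y) = (f(x, y) - f(y, x)) / 2."""
    return lambda x, y: (f(x, y) - f(y, x)) / 2

f = lambda x, y: x * y ** 2      # an arbitrary two-argument "state"
s, a = S(f), A(f)

x, y = 2.0, 3.0
print(s(x, y) + a(x, y) == f(x, y))  # f = Sf + Af
print(s(x, y) == s(y, x))            # Sf is symmetric
print(a(x, y) == -a(y, x))           # Af is antisymmetric
```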
 
  • #6
You are using the term basis in two different ways. A countable Hilbert basis is not a vector basis; it is a sequence of orthogonal vectors whose span is dense.

Again, you are using the term decomposition in two different senses, as the orthogonal decomposition obtained is not a vector direct sum but a Hilbert direct sum.

None of these facts contradicts my counterexample to the assertion that every subspace of a Hilbert space has an orthogonal complement such that every vector is a sum of one vector in one plus one vector in the other.

I.e. I have given a counterexample to the "theorem" asserted above:

"A thm says:

if W is a subspace of V then V = direct sum of W and CW( ort. complement of W)
i.e. for all v € V there exist w € W & w' € CW s.t. v= w+w'."

I.e. if you take the subspace W of functions vanishing at 0 in the space V of all continuous functions on [0,1], then it is not possible to write the constant function f(x) = 1 as a sum of a function in W and a function in its orthocomplement.

Similarly Halls' statement: "Given such a subspace and its orthogonal complement, you can find a basis for each that, together, forms a basis for the whole vector space." is false in this setting, if you assume that "basis" means vector basis.

This is quite different and much stronger than the weak statement that every function f can be written as a sum of two orthogonal functions, which is of course a trivial statement using f and 0.

Again, it is of course completely trivial that a Hilbert space has some direct sum decomposition into orthogonal subspaces, but still it is apparently not true that every subspace of a Hilbert space is such a summand. For one thing, not every subspace of a Hilbert space is closed.
 
Last edited:
  • #7
Are you saying the set of continuous functions is a Hilbert space?

If I remember correctly, the original theorem holds for any inner product space and a complete subspace. Pretty sure I don't remember the proof right now.
 
  • #8
matness said:
A thm says:

if W is a subspace of V then V = direct sum of W and CW (the orthogonal complement of W)
i.e. for all v ∈ V there exist w ∈ W and w' ∈ CW s.t. v = w + w'

Things became more complicated than I thought.
Actually I don't know much about Hilbert spaces, so I did not understand some parts.

After reading a proof of the above theorem, I see that the bases are always taken to be finite.
So, is the theorem false for infinite-dimensional spaces (like the space of continuous functions), OR is there still a proof for the infinite case?

If the theorem is true in infinite dimensions, I have one more problem.
> mathwonk says:
"f can be written as a sum of two orthogonal functions, which is of course a trivial statement using f and 0."

And the theorem says the representation v = w + w' is unique.
Conclusion: f and 0 is the only way to get this representation.

>> There should be a mistake in my conclusion, but what?
 
  • #9
There are two hypotheses for the theorem that a subspace plus its orthocomplement equals the original space:

1) the original space is a Hilbert space;

2) the subspace is closed.

So dextercioby's statement, although set in Hilbert space, is too imprecise to see what claim he is actually making, and HallsofIvy's statement omits both hypotheses.

I did not say the space of continuous functions is a Hilbert space, and I do not say that. Still, it suffices to give a counterexample to the given statement, that a subspace plus its orthocomplement equals the whole space. I used it merely because Halls used it.

If you want a counterexample in a separable infinite-dimensional Hilbert space, just take any complete (i.e. maximal) orthonormal set, and then take the vector span of this set. That gives a dense, proper, non-closed subspace whose orthocomplement is the zero space. In particular, the sum of the space and its orthocomplement is not the whole space.

This stuff is found in any book on Hilbert space.
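To make that counterexample concrete (a numeric sketch of my own, in the standard sequence space): take x = (1/2, 1/4, 1/8, ...). Its distance to span{e_1, ..., e_k} is the norm of the tail, which shrinks to 0 as k grows (so x lies in the closure of the span) but is never 0 (so x is not in the span itself).

```python
import math

def dist_to_span(k, terms=60):
    # distance from x = (1/2, 1/4, 1/8, ...) to span{e_1, ..., e_k}
    # is the norm of the tail (x_{k+1}, x_{k+2}, ...); truncate the
    # tail at `terms` entries, which is far below float precision
    return math.sqrt(sum((0.5 ** n) ** 2 for n in range(k + 1, terms + 1)))

dists = [dist_to_span(k) for k in range(6)]
print(dists)   # strictly decreasing toward 0, yet every entry is positive
```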
 
Last edited:
  • #10
Are you sure it isn't sufficient to have just an inner product space and a complete subspace? It has been several months since I've looked at this stuff, but I seem to recall that in the proof, completeness only comes in when you are trying to show that a certain sequence in the subspace converges.

To Matness
If you don't know about Hilbert spaces, you shouldn't concern yourself with the case of infinite dimensions. As for what the theorem says about f = f + 0: this is not just an issue in the infinite case, as 0 is orthogonal to every vector. What the theorem says is that given a vector f and a closed subspace W, there are unique w in W and w' in CW such that w + w' = f. If f is in W or in CW, then f + 0 is the only way to write f as the sum of vectors from those two sets. However, f may be in neither W nor CW, in which case f + 0 is not actually a sum of vectors from those two sets. So, to clarify, the representation as the sum of orthogonal vectors is unique up to your choice of W. Does that make more sense?
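A finite-dimensional picture of this (my own example): in R^3 take the closed subspace W = the xy-plane, so CW is the z-axis. The splitting v = w + w' is unique, and it reduces to v + 0 only when v already lies in W or in CW.

```python
def split(v):
    """Unique decomposition of v into (w in W, w' in CW),
    where W is the xy-plane and CW is the z-axis."""
    w  = (v[0], v[1], 0.0)   # projection onto W
    wp = (0.0, 0.0, v[2])    # projection onto CW
    return w, wp

v = (1.0, 2.0, 3.0)          # lies in neither W nor CW
w, wp = split(v)
dot = sum(a * b for a, b in zip(w, wp))
print(w, wp, dot)            # w ⟂ w', and neither summand is the zero vector
```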

Thanks,
Steven

ps to mw: I certainly didn't think that you actually believed the set of continuous functions was a Hilbert space, but that does seem to be what one of your messages says. My comment was simply intended as an opportunity for you to clarify.
 
Last edited:
  • #11
You are quite right that my claim that I had given a counterexample in Hilbert space implied that the counterexample I did give occurs in Hilbert space, although it does not.

I cannot remember at the moment what I was thinking, whether I was being careless or merely meant that the type of example I gave also exists in Hilbert space.

I did state quite precisely the original statement to which my counterexample in the space of continuous functions applied, however, and Hilbert space was not mentioned there.

I was nonetheless being careless, and somewhat challenged by trying to simultaneously give counterexamples to all the previous wrong statements. I.e. the original claim is not always true: it fails in the space of continuous functions, it fails in a Hilbert space, and it fails in a separable Hilbert space.

But in fact you are right that the explicit example I gave was not a counterexample to all these statements. For a Hilbert space, take the example above of the vector span of a complete orthonormal set.

Now you are asking whether any complete subspace of an inner product space has an orthocomplement which is also an algebraic complement? I don't know offhand. It does seem that one should be able to project onto it, using completeness.

Yes, I think you are right, but the hypothesis is a bit odd, i.e. a complete subspace of an incomplete space.

No, maybe not, for it implies that every finite-dimensional subspace of an inner product space has a nice orthocomplement. That seems useful.

Indeed the proof follows immediately from the Hilbert space case, as follows:

Let V be any inner product space and U a complete subspace. Then complete V to a Hilbert space W.

Then let x be any vector in V. I claim x = y + z where y is in U and z is perpendicular to U. This is true in W: since W is a Hilbert space and U is complete, hence closed in W, there is a y in U and a z in W such that z is perpendicular to U and x = y + z. But since y is in U it is also in V, hence z = x - y is also in V. So we have solved the problem in V.

So there really is no greater generality.

Please forgive me; sometimes, especially late at night, I get impatient with dogmatically phrased wrong assertions and give hasty, unclear refutations of them. But then I am just doing the same thing myself.
 
Last edited:
  • #12
Now I understand better. Thanks for your help.
 

