Can a subspace be written as the direct sum of two orthogonal spaces?

  • Thread starter: matness
  • Tags: Orthogonal
SUMMARY

The discussion centers on the theorem stating that if W is a subspace of V, then V can be expressed as the direct sum of W and its orthogonal complement CW. Participants explore the implications of this theorem, particularly in the context of Hilbert spaces and the uniqueness of representation. Counterexamples are provided, particularly concerning the space of continuous functions on the interval [0,1], where it is demonstrated that not all subspaces have orthogonal complements that yield the entire space. The conversation highlights the necessity of closed subspaces for the theorem to hold true in infinite dimensions.

PREREQUISITES
  • Understanding of Hilbert spaces and their properties
  • Knowledge of orthogonal complements in vector spaces
  • Familiarity with inner product spaces and their definitions
  • Basic concepts of linear algebra, particularly direct sums
NEXT STEPS
  • Study the properties of closed subspaces in Hilbert spaces
  • Learn about the concept of orthogonal projectors in functional analysis
  • Explore counterexamples in infinite-dimensional spaces, particularly in the context of continuous functions
  • Investigate the implications of the Riesz Representation Theorem in relation to orthogonal complements
USEFUL FOR

Mathematicians, students of functional analysis, and anyone interested in the properties of Hilbert spaces and orthogonal decompositions in vector spaces.

matness
A thm says:

if W is a subspace of V, then V = direct sum of W and CW (the orthogonal complement of W),
i.e. for all v ∈ V there exist w ∈ W and w' ∈ CW such that v = w + w'.


Does this mean that we can write a function as a sum of two orthogonal functions?

Also, I don't know the proof of this theorem. Can you also send a simple sketch of the proof?
Thanks..
 
What functions are you talking about? If you are thinking of a vector space of functions (the set of functions continuous on an interval, or the set of all polynomials, for example), then what that says is that if we have a subspace of such functions, we can find the orthogonal complement subspace. Again, that depends on how you have defined the inner product: (f,g)= \int_a^b f(t)g(t)\,dt, for example. Yes, in that situation, any function can be written as the sum of two functions orthogonal to one another.

Given such a subspace and its orthogonal complement, you can find a basis for each that, together, forms a basis for the whole vector space. Given any function f in the vector space, write it as a linear combination of the basis functions. Then let F be the linear combination taking only those basis functions in the first subspace, and G the linear combination taking only those basis functions in the orthogonal complement: f = F + G, and F and G are orthogonal.
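In finite dimensions this recipe can be carried out concretely. Here is a small numerical sketch (my addition, not from the thread), with an arbitrary two-dimensional subspace of R^4 standing in for a function space: project onto the subspace, and the residual automatically lands in the orthogonal complement.

```python
import numpy as np

# Subspace W of R^4 spanned by the (non-orthogonal) columns of A.
A = np.array([[1.0, 0.0],
              [1.0, 1.0],
              [0.0, 1.0],
              [0.0, 0.0]])

# Orthogonal projector onto W:  P = A (A^T A)^{-1} A^T
P = A @ np.linalg.inv(A.T @ A) @ A.T

v = np.array([1.0, 2.0, 3.0, 4.0])
w = P @ v           # component of v in W
w_perp = v - w      # component in the orthogonal complement CW

assert np.allclose(v, w + w_perp)        # v = w + w'
assert abs(float(np.dot(w, w_perp))) < 1e-9   # w ⊥ w'
```

This is exactly the decomposition v = w + w' from the theorem; in finite dimensions every subspace is closed, so the construction never fails.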
 
I'm not sure I understand your answer, Halls. Take the space of continuous functions on the interval [0,1] with the inner product you gave, by integration. Then take the subspace of functions vanishing at 0.

I claim the only function orthogonal to this subspace is the zero function. So it is not true that a basis of this subspace and a basis of its orthocomplement yield a basis of the whole space. I.e. there seems to be a problem with non-closed subspaces.

what am i misunderstanding?

of course in finite dimensions there is no problem.
 
Well, if the Hilbert space is separable, then it admits a countable subset everywhere dense in the Hilbert space which can be structured as an orthonormal basis.
In case the Hilbert space admits an orthonormal basis, it is easy to see that it admits a decomposition as a direct sum of mutually orthogonal subspaces.

This is accomplished by introducing orthogonal projectors.

Let the unit operator on the Hilbert space \hat{1}_{\mathcal{H}} admit the following decomposition

\hat{1}_{\mathcal{H}}=\left[\frac{1}{2}\left(\hat{1}_{\mathcal{H}}+\hat{U}\right)\right]+\left[\frac{1}{2}\left(\hat{1}_{\mathcal{H}}-\hat{U}\right)\right] (1)

where I require that the densely defined linear operator \hat{U} be
*bounded
*self-adjoint
*unitary.

Then it's easy to see that both operators on the RHS (the ones in square brackets) are
*bounded
*self-adjoint
*idempotent

and therefore they are orthogonal projectors. Moreover, they are mutually orthogonal.

So the separable Hilbert space admits a decomposition as a direct sum of two subspaces orthogonal to each other.

Daniel.
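A finite-dimensional sketch of this construction (my addition, not from the thread): here \hat{U} is played by the 4x4 swap matrix, a concrete bounded, self-adjoint, unitary operator, and the two bracketed operators of equation (1) come out as orthogonal projectors.

```python
import numpy as np

# U: the swap operator on C^2 ⊗ C^2, exchanging the two tensor factors.
# It is self-adjoint (U = U^T) and unitary (U^2 = I).
U = np.zeros((4, 4))
U[0, 0] = U[3, 3] = 1.0
U[1, 2] = U[2, 1] = 1.0

I = np.eye(4)
P_plus = 0.5 * (I + U)    # (1/2)(1 + U)
P_minus = 0.5 * (I - U)   # (1/2)(1 - U)

for P in (P_plus, P_minus):
    assert np.allclose(P, P.T)       # self-adjoint
    assert np.allclose(P @ P, P)     # idempotent

assert np.allclose(P_plus @ P_minus, 0)   # mutually orthogonal projectors
assert np.allclose(P_plus + P_minus, I)   # they resolve the identity, eq. (1)
```

Since U^2 = 1, one gets P_+ P_- = (1 - U^2)/4 = 0 directly, which is what the assertions confirm numerically.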
 
In Quantum Mechanics, the previous construction appears in the case of multiparticle systems built from 2 identical particles.

That \hat{U} is the permutation operator; the operator with the "+" in the square brackets is called the "symmetrizer" and the one with the "-" is called the "antisymmetrizer".

Daniel.
 
you are using the term basis in two different ways. a countable hilbert basis is not a vector basis; it is a sequence of orthogonal vectors whose span is dense.

again, you are using the term decomposition in two different senses, as the orthogonal decomposition obtained is not a vector direct sum but a hilbert direct sum.

none of these facts contradict my counterexample to the assertion that every subspace of a hilbert space has an orthogonal complement such that every vector is a sum of one vector in one plus one vector in the other.

i.e. i have given a counterexample to the "theorem" asserted above:

"A thm says:

if W is a subspace of V then V = direct sum of W and CW (ort. complement of W)
i.e. for all v ∈ V there exist w ∈ W & w' ∈ CW s.t. v = w + w'."

i.e. if you take the subspace W of functions vanishing at 0 in the space V of all continuous functions on [0,1], then it is not possible to write the constant function f(x) = 1 as a sum of a function in W and a function in its orthocomplement.

similarly, Halls' statement, "Given such a subspace and its orthogonal complement, you can find a basis for each that, together, forms a basis for the whole vector space," is false in this setting, if you assume that "basis" means vector basis.

this is quite different from, and much stronger than, the weak statement that every function f can be written as a sum of two orthogonal functions, which is of course a trivial statement using f and 0.

again, it is of course completely trivial that a hilbert space has some direct sum decomposition into orthogonal subspaces, but still it is apparently not true that every subspace of a hilbert space is such a summand. for one thing, not every subspace of a hilbert space is closed.
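A quick numerical sketch (my addition, not from the thread) of why the orthocomplement of W is trivial here: in the L^2 inner product, the constant function 1 is approximated arbitrarily well by continuous functions vanishing at 0, e.g. f_n(x) = min(nx, 1), whose squared distance to 1 is exactly 1/(3n).

```python
import numpy as np

# Squared L^2 distance between f(x) = 1 and f_n(x) = min(n*x, 1),
# a continuous function vanishing at 0.  Exact value: 1/(3n).
def sq_dist(n, num_pts=100001):
    x = np.linspace(0.0, 1.0, num_pts)
    gap = (1.0 - np.minimum(n * x, 1.0)) ** 2
    dx = x[1] - x[0]
    return float(np.sum((gap[:-1] + gap[1:]) / 2.0) * dx)  # trapezoid rule

for n in (1, 10, 100):
    assert abs(sq_dist(n) - 1.0 / (3.0 * n)) < 1e-6  # matches 1/(3n)
```

Since the distance tends to 0, any continuous g orthogonal to all of W is orthogonal to a set that is dense in the L^2 sense, forcing g = 0; hence W + W^⊥ = W ≠ V.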
 
are you saying the set of continuous functions is a Hilbert Space?

If I remember correctly, the original theorem holds for any inner product space and a complete subspace. Pretty sure I don't remember the proof right now.
 
matness said:
A thm says:

if W is a subspace of V then V = direct sum of W and CW (ort. complement of W)
i.e. for all v ∈ V there exist w ∈ W & w' ∈ CW s.t. v = w + w'

Things became more complicated than I thought.
Actually I don't know much about Hilbert spaces, therefore I did not understand some parts.

After reading a proof of the above theorem, I see that the bases are always taken to be finite.
So, is the theorem false for infinite-dimensional spaces (like the space of continuous functions), OR is there still a proof for the infinite case?

If the theorem is true in infinite dimensions, I have one more problem.
> mathwonk says:
"f can be written as a sum of two orthogonal functions, which is of course a trivial statement using f and 0."

and the theorem says the representation v = w + w' is unique.
Conclusion: f and 0 is the only way to get this representation.

>> there should be a mistake in my conclusions, but what?
 
there are two hypotheses for the theorem that a subspace plus its complement equals the original space.

1) the original space is a hilbert space.

2) the subspace is closed.

so dextercioby's statement, although in hilbert space, is too imprecise to see what statement he is actually making, and halls of ivy's statement omits both hypotheses.

i did not say the space of continuous functions is a hilbert space, and i do not say that. still it suffices to give a counterexample to the given statement, that a subspace plus its orthocomplement equals the whole space. i used it merely because halls used it.

if you want a counterexample in a separable infinite dimensional hilbert space, just take any complete (i.e. maximal) orthonormal set, and then take the vector span of this set. that gives a dense, proper, non closed subspace, such that its orthocomplement is the zero space. In particular the sum of the space and its orthocomplement is not the whole space.
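The step "its orthocomplement is the zero space" follows from continuity of the inner product; spelled out (a standard one-line argument, added here for completeness):

```latex
x \perp S,\quad s_n \in S,\quad s_n \to x
\;\Longrightarrow\;
\|x\|^2 \;=\; \langle x, x\rangle \;=\; \lim_{n\to\infty}\langle x, s_n\rangle \;=\; 0,
\qquad\text{so } S^\perp = \{0\} \text{ whenever } S \text{ is dense.}
```

Thus the dense proper subspace S satisfies S + S^\perp = S \neq H.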


This stuff is found in any book on hilbert space.
 
Are you sure it isn't sufficient to have just an inner product space and a complete subspace? It has been several months since I've looked at this stuff, but I seem to recall that in the proof, completeness only comes in when you are trying to show that a certain sequence in the subspace converges.

To Matness:
If you don't know about Hilbert spaces, you shouldn't concern yourself with the case of infinite dimensions. As for what the theorem says about f = f + 0: this is not just an issue in the infinite case, since 0 is orthogonal to every vector. What the theorem says is that given a vector f and a closed subspace W, there are unique w in W and w' in CW such that w + w' = f. If f is in W or in CW, then f + 0 is indeed the only way to write f as the sum of vectors from those two sets. However, f may be in neither W nor CW, in which case f + 0 is not actually of the form of a sum of vectors from those two sets. So, to clarify: the representation as the sum of orthogonal vectors is unique once W is chosen. Does that make more sense?

Thanks,
Steven

ps to mw: I certainly didn't think that you actually believed the set of continuous functions was a Hilbert space, but that does seem to be what one of your messages says. My comment was simply intended as an opportunity for you to clarify.
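A small numerical sketch of the uniqueness point (my addition, in R^3 with the dot product): for an f lying in neither W nor CW, the unique decomposition has both components nonzero, so f = f + 0 is simply not of the required form.

```python
import numpy as np

# Closed subspace W = span{(1, 1, 0)} in R^3.
u = np.array([1.0, 1.0, 0.0])
u = u / np.linalg.norm(u)

f = np.array([2.0, 0.0, 1.0])   # f lies in neither W nor CW

w = np.dot(f, u) * u            # the unique component in W
w_perp = f - w                  # the unique component in CW

assert np.allclose(f, w + w_perp)
assert abs(float(np.dot(w_perp, u))) < 1e-12   # w' ⊥ W
# Both pieces are nonzero, so "f = f + 0" is not a W ⊕ CW decomposition.
assert not np.allclose(w, f) and not np.allclose(w, 0)
```

Here w = (1, 1, 0) and w' = (1, -1, 1); no other pair in W x CW sums to f.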
 
you are quite right that my claim that i had given a counterexample in hilbert space implied that the counterexample which i did give occurs in hilbert space, although it does not.

i cannot remember at the moment what i was thinking, whether i was being careless, or merely meant that the type of example i gave also existed in hilbert space.

i did state quite precisely the original statement, to which my counterexample in the space of continuous functions applied however, and hilbert space was not mentioned there.

i was nonetheless being careless, and somewhat challenged by trying to simultaneously give counterexamples to all the previous wrong statements. I.e. the original claim made is not always true, and in particular is not true in the space of continuous functions, and is also not true in a hilbert space, and is not true in a separable hilbert space.

but in fact you are right that the explicit example i gave was not a counterexample to all these statements. for a hilbert space, take the example above: the vector span of a complete orthonormal set.

now you are asking whether any complete subspace of an inner product space has an orthocomplement which is also an algebraic complement? I don't know immediately off hand. It does seem that one should be able to project onto it, using completeness.

yes i think you are right, but the hypothesis is a bit odd, i.e. to have a complete subspace of an incomplete space.

no maybe not, for it seems to imply that every finite dimensional subspace of an inner product space has a nice ortho complement. that seems useful.

indeed the proof follows immediately from the hilbert space case, as follows:

let V be any inner product space and U a complete subspace. Then complete V to a hilbert space W.

then let x be any vector in V. I claim x equals y + z where y is in U and z is perpendicular to U. It is true in W: since W is a hilbert space and U is complete, hence closed in W, there is a y in U and a z in W such that z is perpendicular to U and x = y + z. But since y is in U it is also in V, hence z = x - y is also in V. So we have solved the problem in V.

so there really is no greater generality.
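The completion argument above, written compactly (same notation as the post, added here as a summary):

```latex
V \subseteq W = \text{(Hilbert completion of } V\text{)},\qquad
U \subseteq V \text{ complete} \;\Rightarrow\; U \text{ closed in } W. \\[4pt]
x \in V \subseteq W \;\Rightarrow\; x = y + z,\quad y \in U,\ z \perp U
\quad\text{(orthogonal projection in } W\text{)}. \\[4pt]
y \in U \subseteq V \;\Rightarrow\; z = x - y \in V
\;\Rightarrow\; V = U \oplus (U^\perp \cap V).
```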

please forgive me; sometimes, especially late at night, i get impatient with dogmatically phrased wrong assertions, and give hasty, unclear refutations of them. but then i am just doing the same thing myself.
 
now i understand better. thanks for your help!
 
