Symmetric bilinear forms on infinite-dimensional spaces

In summary, the conversation discusses the diagonalisability of a symmetric bilinear form B on a finite-dimensional vector space V over any field F of characteristic not 2, and asks whether the same holds for infinite-dimensional vector spaces, provided they have an algebraic basis. A possible counterexample is given, and the possibility of finding an orthogonal (rather than orthonormal) basis is discussed. The conversation also touches on whether a symmetric bilinear form on a Hilbert space can fail to have an orthogonal basis, and on the apparent lack of theory on symmetric bilinear forms on infinite-dimensional spaces. The example of [itex]T:L^2[0,1]\rightarrow L^2[0,1][/itex] with [itex]T(f)(x)=xf(x)[/itex] and its corresponding symmetric bilinear form is examined: T has no eigenvalues, yet the form appears to be diagonalisable.
  • #1
henry_m
It is a well known fact that a symmetric bilinear form B on a finite-dimensional vector space V over any field F of characteristic not 2 is diagonalisable, i.e. there exists a basis [itex]\{e_i\}[/itex] such that [itex]B(e_i,e_j)=0[/itex] for [itex]i\neq j[/itex].

Does the same hold for an infinite-dimensional vector space, provided that it has an algebraic basis? My instincts say no, but I can't come up with a counterexample.

Here's one possible candidate. Let V be the set of all sequences [itex]\{x_i\}_{i\geq1}[/itex] over F for which only finitely many of the [itex]x_i[/itex] are nonzero. Let B be the following symmetric bilinear form:
[tex]B(\{x_i\},\{y_i\})=\sum_{i\geq1} (x_i y_{i+1}+ x_{i+1}y_i)[/tex]
which is a finite sum. Can you find a basis for V which is orthonormal wrt B? Can you prove that no such basis exists?
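(To experiment with this form, here is a minimal Python sketch; the representation of a sequence as the list of its leading entries, and the helper name B, are my own choices for illustration.)

[code]
# A finitely supported sequence is stored as the Python list of its
# leading entries; everything past the end of the list is zero.

def B(x, y):
    # B({x},{y}) = sum_i (x_i*y_{i+1} + x_{i+1}*y_i), a finite sum.
    n = max(len(x), len(y))
    x = x + [0] * (n + 1 - len(x))
    y = y + [0] * (n + 1 - len(y))
    return sum(x[i] * y[i + 1] + x[i + 1] * y[i] for i in range(n))

print(B([1], [1]))        # 0: e_1 is isotropic, B(e_1, e_1) = 0
print(B([1], [0, 1]))     # 1: but e_1 pairs with e_2
print(B([1, 1], [1, 1]))  # 2
[/code]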

Thanks!
 
  • #2
henry_m said:
It is a well known fact that a symmetric bilinear form B on a finite-dimensional vector space V over any field F of characteristic not 2 is diagonalisable, i.e. there exists a basis [itex]\{e_i\}[/itex] such that [itex]B(e_i,e_j)=0[/itex] for [itex]i\neq j[/itex].

Does the same hold for an infinite-dimensional vector space, provided that it has an algebraic basis? My instincts say no, but I can't come up with a counterexample.

I think yes, at least for vector spaces of countable dimension. Some sort of Gram-Schmidt process could give you an orthogonal basis. But this will, in general, not be an orthonormal basis!

Here's one possible candidate. Let V be the set of all sequences [itex]\{x_i\}_{i\geq1}[/itex] over F for which only finitely many of the [itex]x_i[/itex] are nonzero. Let B be the following symmetric bilinear form:
[tex]B(\{x_i\},\{y_i\})=\sum_{i\geq1} (x_i y_{i+1}+ x_{i+1}y_i)[/tex]
which is a finite sum. Can you find a basis for V which is orthonormal wrt B? Can you prove that no such basis exists?

Check here: http://www.math.unl.edu/~bharbourne1/M818Spr05/OthogonalBasisAlgorithmsRev.pdf
This is a Gram-Schmidt procedure for arbitrary symmetric bilinear forms. Of course, the process outlined there is only for finite-dimensional spaces, but I don't think it's hard to check that we can actually extend this to the countable case.
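(To make the projection step concrete, here is a naive Python sketch of Gram-Schmidt relative to B, my own illustration rather than the algorithm from those notes verbatim. It only covers the easy case where every pivot u has B(u,u) ≠ 0; its output reproduces the first three of the four vectors computed by hand below, and shows how an isotropic vector turns up.)

[code]
from fractions import Fraction

def B(x, y):
    # B({x},{y}) = sum_i (x_i*y_{i+1} + x_{i+1}*y_i); sequences are
    # lists of their leading entries, padded with zeros as needed.
    n = max(len(x), len(y))
    x = x + [0] * (n + 1 - len(x))
    y = y + [0] * (n + 1 - len(y))
    return sum(x[i] * y[i + 1] + x[i + 1] * y[i] for i in range(n))

def subtract(v, c, u):
    # Return v - c*u, padding the shorter list with zeros.
    n = max(len(v), len(u))
    v = v + [0] * (n - len(v))
    u = u + [0] * (n - len(u))
    return [vi - c * ui for vi, ui in zip(v, u)]

def gram_schmidt(vectors):
    # Naive Gram-Schmidt relative to B: only valid while every pivot u
    # satisfies B(u,u) != 0; isotropic pivots need the extra step
    # described in the notes linked above.
    basis = []
    for v in vectors:
        w = list(v)
        for u in basis:
            w = subtract(w, Fraction(B(w, u)) / B(u, u), u)
        basis.append(w)
    return basis

ortho = gram_schmidt([[1, 1], [1, 0], [1, 1, 1]])
print(ortho)  # (1,1), (1/2,-1/2), (-1,0,1): the last one is isotropic!
print([[B(u, w) for w in ortho] for u in ortho])  # off-diagonals are 0
[/code]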

I started the calculation from the basis {(1,0,0,0,...),(1,1,0,0,...),(1,1,1,0,...),...} and I obtained the following 4 elements:

[tex]\{(1,1,0,0,0,...),(1/2,-1/2,0,0,0,...),(-1,0,1,0,0,...),(-3/2,0,3/2,1,0,...)\}[/tex]

I have no doubt that this would give you an orthogonal basis in the end.

However, I don't think there is an orthonormal basis for this space in general. The problem here is that there exist isotropic vectors (i.e. nonzero v with B(v,v)=0). If we could write

[tex]v=\alpha_1e_1+...+\alpha_ne_n[/tex]

for some orthonormal basis, then

[tex]B(v,v)=\alpha_1^2+...+\alpha_n^2[/tex]

but this can never be 0 (in a real vector space) unless v=0. Of course, the situation can be resolved if we view it as a complex vector space; then the basis outlined above can be modified into an orthonormal basis.

But then there might be other problems (which don't arise in this example). The form might be degenerate: there might exist a nonzero v that is orthogonal to all vectors. We can still find an orthogonal basis (we will have to put v in it), but not an orthonormal basis...
 
  • #3
Thanks for the reply. Yes, I realize that we won't necessarily get an orthonormal basis; I meant to say orthogonal.

I've managed to get a proof for the countable case. Suppose you have a countable basis [itex]\{e_1,e_2,\ldots\}[/itex], and let [itex]V_n=\mathrm{span}\{e_1,\ldots,e_n\}[/itex]. The strategy is to construct inductively a sequence of bases [itex]\mathcal{B}_n[/itex] for [itex]V_n[/itex], orthogonal with respect to (the restriction of) the form B, with [itex]\mathcal{B}_n\subseteq \mathcal{B}_{n+1}[/itex]. This follows the normal Gram-Schmidt procedure, with the subtlety that you must demonstrate that [itex]V_{n+1}[/itex] contains a vector [itex]v[/itex], not in [itex]V_n[/itex], with [itex]B(v,v)\neq0[/itex]. Then the union of all these bases forms an orthogonal basis for the whole space.
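(To spell out the projection in that inductive step, assuming, as in the easy case, that every vector already in [itex]\mathcal{B}_n[/itex] is non-isotropic, the new basis vector is

[tex]u_{n+1}=v-\sum_{u\in\mathcal{B}_n}\frac{B(v,u)}{B(u,u)}\,u,[/tex]

where v is the vector with [itex]B(v,v)\neq0[/itex] whose existence has to be demonstrated; by construction [itex]u_{n+1}[/itex] is orthogonal to everything in [itex]\mathcal{B}_n[/itex]. Isotropic basis vectors would need separate handling, which is where the subtlety lies.)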

So we've extended the conclusion from finite to countable. But what if there isn't a countable basis? Let S be your favourite set, and V the set of functions from S to F with finite support, with pointwise operations. (Up to isomorphism, I think this is the most general vector space with an algebraic basis.) Is there some symmetric bilinear form on V with no orthogonal basis? I don't see how we can use anything like the same arguments here.
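(As an aside, here is how that model looks in code, a sketch with vectors stored as Python dicts; the representation and helper names are my own choices for illustration.)

[code]
# Vectors in the space of finite-support functions S -> F, stored as
# dicts mapping elements of S to their nonzero coefficients.

def add(f, g):
    # Pointwise sum, dropping entries that cancel to zero.
    h = dict(f)
    for s, c in g.items():
        h[s] = h.get(s, 0) + c
        if h[s] == 0:
            del h[s]
    return h

def scale(c, f):
    # Pointwise scalar multiple.
    return {} if c == 0 else {s: c * v for s, v in f.items()}

# S can be any set whatsoever; the indicator functions of the points of S
# form an algebraic basis.
e_a, e_b = {"a": 1}, {"b": 1}
print(add(scale(3, e_a), scale(-2, e_b)))  # {'a': 3, 'b': -2}
[/code]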

Another similar question (which is probably more relevant to what made me think about this, namely the Killing form on infinite-dimensional Lie algebras): what if we have a Banach space and ask the same question, but with "basis" having the analytic meaning (a Schauder basis) rather than the algebraic one (a Hamel basis)? I haven't thought much about this, but I'm guessing it's true again for countable bases, though the proof will require some alteration.
 
  • #4
Hmm, consider symmetric bilinear forms on Hilbert spaces. One kind of symmetric bilinear form is <x,Ty> with T a self-adjoint operator. Asking whether this form is diagonalisable is really asking whether T is diagonalisable. But I don't think that every self-adjoint operator is.

For example, consider [itex]T:L^2[0,1]\rightarrow L^2[0,1][/itex] with [itex]T(f)(x)=xf(x)[/itex]. If I'm correct then this has no eigenvalues, so it's not diagonalisable.
The corresponding symmetric bilinear form would be:

[tex]B(f,g)=\int_0^1 x f(x) g(x)\,dx[/tex].

I didn't check all the details, but this could work.
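(As a quick sanity check, here is a numerical sketch of this form using scipy quadrature; the helper name B is my choice.)

[code]
import numpy as np
from scipy.integrate import quad

def B(f, g):
    # B(f, g) = integral_0^1 x f(x) g(x) dx, evaluated numerically.
    return quad(lambda x: x * f(x) * g(x), 0.0, 1.0)[0]

# The weight x is positive almost everywhere on [0,1], so this form is
# symmetric and positive definite, i.e. a genuine inner product on L^2[0,1].
print(B(lambda x: 1.0, lambda x: 1.0))  # integral of x dx = 0.5
print(B(np.sin, np.cos))                # approx 0.2177
[/code]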

I'm still searching for more satisfactory answers, but I've noticed that there is not really much theory on symmetric bilinear forms on infinite-dimensional spaces...
 
  • #5
micromass said:
I'm still searching for more satisfactory answers, but I've noticed that there is not really much theory on symmetric bilinear forms on infinite-dimensional spaces...
Yes, that's exactly what I'm finding!

For the case of Hilbert spaces, I'm not sure if diagonalisability as an endomorphism [itex]V\to V^*\to V[/itex] (where the first map is the bilinear form and the second the Riesz isomorphism) is necessary for diagonalisability as a form.
micromass said:
For example, consider [itex]T:L^2[0,1]\rightarrow L^2[0,1][/itex] with [itex]T(f)(x)=xf(x)[/itex]. If I'm correct then this has no eigenvalues, so it's not diagonalisable.
The corresponding symmetric bilinear form would be:

[tex]B(f,g)=\int_0^1 x f(x) g(x)\,dx[/tex].

As you say, as an endomorphism this has no eigenvectors, so it isn't diagonalisable. But off the top of my head, I think the functions [itex]\sin(n\pi x)/\sqrt x[/itex] for positive integer n form an orthogonal basis of the space: the weight x cancels the [itex]1/\sqrt x[/itex] factors, so [itex]B(f_n,f_m)=\int_0^1 \sin(n\pi x)\sin(m\pi x)\,dx[/itex], which vanishes for [itex]n\neq m[/itex]. So, if I'm right, T is a self-adjoint map with no eigenvalues, but the associated symmetric bilinear form is diagonalisable.
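(A quick numerical check of that orthogonality claim, a sketch with function names of my own choosing; completeness of the family is a separate question that this does not test.)

[code]
import numpy as np
from scipy.integrate import quad

def B(f, g):
    # B(f, g) = integral_0^1 x f(x) g(x) dx
    return quad(lambda x: x * f(x) * g(x), 0.0, 1.0)[0]

def phi(n):
    # Candidate orthogonal family: phi_n(x) = sin(n*pi*x)/sqrt(x).
    return lambda x: np.sin(n * np.pi * x) / np.sqrt(x)

# The weight x cancels the 1/sqrt(x) factors, so B(phi_n, phi_m) reduces
# to integral_0^1 sin(n pi x) sin(m pi x) dx = (1/2) * delta_{nm}.
gram = [[round(B(phi(n), phi(m)), 6) for m in (1, 2, 3)] for n in (1, 2, 3)]
print(gram)  # approx 0.5 on the diagonal, 0 elsewhere
[/code]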

Thanks for your input. I'm not sure we're going to reach any firm conclusions here without disproportionate effort, unfortunately...
 

1. What is a symmetric bilinear form on an infinite-dimensional space?

A symmetric bilinear form on an infinite-dimensional space is a function that takes in two vectors and produces a scalar value. It is linear in each of its arguments and satisfies the property of symmetry, meaning that swapping the order of the arguments does not change the result.

2. How are symmetric bilinear forms represented mathematically?

Symmetric bilinear forms can be represented using matrices or inner products. In the matrix representation, relative to a chosen basis [itex]\{e_i\}[/itex], the entry [itex]A_{ij}=B(e_i,e_j)[/itex] records the value of the form on a pair of basis vectors. Conversely, a positive-definite symmetric bilinear form is itself an inner product, generalising the dot product to the infinite-dimensional space.

3. What is the significance of symmetric bilinear forms in linear algebra?

Symmetric bilinear forms are important in linear algebra because they can be used to define important concepts such as orthogonality, self-adjointness, and positive definiteness. They also play a crucial role in the spectral theory of self-adjoint operators.

4. How do you determine if a symmetric bilinear form is positive definite?

A symmetric bilinear form is positive definite if, for every nonzero vector v, the value of the form on v is strictly positive: B(v,v) > 0. In the finite-dimensional matrix representation, this condition reads v^TAv > 0, where A is the matrix of the form and v^T is the transpose of v.
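(In the finite-dimensional case this condition can be checked numerically from the matrix; a sketch assuming numpy, with a helper name of my own choosing.)

[code]
import numpy as np

def is_positive_definite(A, tol=1e-12):
    # v^T A v > 0 for all nonzero v iff the symmetric part of A has
    # strictly positive eigenvalues (only the symmetric part of A
    # contributes to v^T A v).
    A = np.asarray(A, dtype=float)
    sym = (A + A.T) / 2
    return bool(np.all(np.linalg.eigvalsh(sym) > tol))

print(is_positive_definite([[2, 1], [1, 2]]))  # True (eigenvalues 1 and 3)
print(is_positive_definite([[0, 1], [1, 0]]))  # False: v = (1, -1) gives -2
[/code]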

5. What are some applications of symmetric bilinear forms?

Symmetric bilinear forms have many applications in mathematics, physics, and engineering. They are used in optimization problems, quantum mechanics, and computer graphics. They also have applications in differential geometry, where they are used to define the curvature of surfaces.
