# Orthonormal basis in $L^2([a,b])$

1. Nov 3, 2008

### tkjacobsen

1. The problem statement, all variables and given/known data
$(e_n)$ is an orthonormal basis for $L^2([0,1])$.

Want to show that $(f_n)$ is an orthonormal basis for $L^2([a,b])$ when $f_n(u) = (b-a)^{-1/2}e_n(\frac{u-a}{b-a})$.

2. Relevant equations
$f_n(u) = (b-a)^{-1/2}e_n(\frac{u-a}{b-a})$

3. The attempt at a solution
I have shown that $(f_n)$ is an orthonormal sequence, but how can I show that it is also a basis?

2. Nov 3, 2008

### HallsofIvy

Since they are orthonormal, they are necessarily linearly independent, so you only need to show that they span the space. Given any $f$ in $L^2([a,b])$, you need to show that $f$ is equal to $\sum a_n f_n$. Use your "relevant equation" to rewrite that sum in terms of $e_n$, and use the fact that $(e_n)$ spans $L^2([0,1])$.
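The hint above can be carried out explicitly with a change of variables; here is one way the substitution step might be written out (a sketch, using only the definitions already given in the thread):

```latex
Given $f \in L^2([a,b])$, define $g(x) = (b-a)^{1/2}\, f(a + (b-a)x)$.
The substitution $u = a + (b-a)x$, $du = (b-a)\,dx$, gives
\[
\int_0^1 |g(x)|^2\,dx = \int_a^b |f(u)|^2\,du ,
\]
so $g \in L^2([0,1])$ and hence $g = \sum_n a_n e_n$ with $a_n = \langle g, e_n \rangle$.
Substituting back,
\[
f(u) = (b-a)^{-1/2}\, g\!\left(\tfrac{u-a}{b-a}\right)
     = \sum_n a_n\, (b-a)^{-1/2}\, e_n\!\left(\tfrac{u-a}{b-a}\right)
     = \sum_n a_n f_n(u),
\]
with convergence in $L^2([a,b])$, since the map $f \mapsto g$ is an isometry.
```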

3. Nov 3, 2008

### tkjacobsen

But then I get
$\sum a_n f_n(u) = (b-a)^{-1/2}\sum a_n e_n( (u-a)/(b-a) )$

Can I then just say that, since $(e_n)$ spans $L^2([0,1])$, the functions $e_n((u-a)/(b-a))$ span $L^2([a,b])$, so the above equals $f$ for the right choice of $a_n$?

Or am I missing something?

Thanks
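As a numerical sanity check of the orthonormality claim (an illustration, not a proof), one can pick a concrete orthonormal basis of $L^2([0,1])$, say $e_n(x) = \sqrt{2}\sin(n\pi x)$, and verify that the rescaled functions $f_n$ have Gram matrix close to the identity on a chosen interval $[a,b]$; the interval and truncation level below are arbitrary choices:

```python
import numpy as np

# Sanity check: take e_n(x) = sqrt(2) sin(n pi x), a known orthonormal
# family in L^2([0,1]), and verify that f_n(u) = (b-a)^{-1/2} e_n((u-a)/(b-a))
# is orthonormal in L^2([a,b]).  Interval [2, 5] is an arbitrary example.
a, b = 2.0, 5.0
u = np.linspace(a, b, 20001)  # fine grid on [a, b]

def e(n, x):
    return np.sqrt(2.0) * np.sin(n * np.pi * x)

def f(n, u):
    return (b - a) ** (-0.5) * e(n, (u - a) / (b - a))

def inner(g, h):
    # trapezoid rule for the L^2 inner product on the grid u
    w = g * h
    return float(np.sum((w[:-1] + w[1:]) / 2.0 * np.diff(u)))

N = 5
G = np.array([[inner(f(m, u), f(n, u)) for n in range(1, N + 1)]
              for m in range(1, N + 1)])
print(np.max(np.abs(G - np.eye(N))))  # should be close to 0
```

The Gram matrix comes out numerically equal to the identity, which is exactly the change-of-variables computation $\langle f_m, f_n \rangle_{[a,b]} = \langle e_m, e_n \rangle_{[0,1]} = \delta_{mn}$ in disguise.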