What's the point of functions being orthogonal?

Engineerbrah
I understand that functions are orthogonal if the inner product of any two functions in an infinite series equals zero.

My question is: why do we prove functions are orthogonal? What can we do with this information?
 
That depends on the topic; there are many applications. Quantum mechanics relies heavily on orthogonal functions, for example, but so do many other fields.
 
mfb said:
That depends on the topic; there are many applications. Quantum mechanics relies heavily on orthogonal functions, for example, but so do many other fields.

Well, I did post this in the maths section. Could you elaborate on what orthogonality implies specifically for Fourier series and Sturm-Liouville problems?
 
For a Fourier series you calculate the coefficients using the fact that the basis functions are orthogonal.
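For instance, on $[-\pi, \pi]$ the trigonometric basis functions satisfy
$$ \int_{-\pi}^{\pi} \cos mx \cos nx \, dx = \pi \, \delta_{mn} \qquad (m, n \geq 1), $$
so if $$ f(x) = \frac{a_0}{2} + \sum_{n=1}^{\infty} \left( a_n \cos nx + b_n \sin nx \right), $$ then multiplying both sides by $\cos mx$ and integrating over $[-\pi, \pi]$ kills every term except one, leaving a single integral for each coefficient: $$ a_m = \frac{1}{\pi} \int_{-\pi}^{\pi} f(x) \cos mx \, dx. $$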
 
Engineerbrah said:
I understand that functions are orthogonal if the inner product of any two functions in an infinite series equals zero.

My question is: why do we prove functions are orthogonal? What can we do with this information?

Two vectors are orthogonal if their inner product vanishes.

Why do we care about basis vectors being orthogonal? It makes it trivial to find the components: if $\{e_1, \dots, e_n\}$ are orthogonal, then $$ v = \sum_{k = 1}^n \frac{\langle v, e_k \rangle}{\langle e_k, e_k \rangle} e_k. $$ If the basis vectors were not orthogonal, it would be necessary to solve a system of n simultaneous equations to find the components.

Functions in this context are vectors, with the complication that the dimension of the space is infinite. Do you want to have to solve a countably infinite system of simultaneous equations to find the components with respect to some basis?
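To make that contrast concrete, here is a minimal numerical sketch (hypothetical NumPy code, not part of the original thread): with an orthogonal basis each coefficient comes from a single inner product, while a non-orthogonal basis couples all the coefficients into one linear system.

import numpy as np

# Approximate the inner product <f, g> = integral of f(x) g(x) dx over
# [-pi, pi] by a Riemann sum on a fine grid.
x = np.linspace(-np.pi, np.pi, 2001)
dx = x[1] - x[0]

def inner(f, g):
    return np.sum(f * g) * dx

# Orthogonal basis sin(kx), k = 1..5, and a target built from two of them.
basis = [np.sin(k * x) for k in range(1, 6)]
f = 3 * np.sin(2 * x) - 0.5 * np.sin(5 * x)

# Orthogonal case: each component is just <f, e_k> / <e_k, e_k>,
# computed independently of all the others.
coeffs = [inner(f, e) / inner(e, e) for e in basis]
print(np.round(coeffs, 3))  # -> [ 0.  3.  0.  0. -0.5]

# Non-orthogonal basis 1, x, ..., x^4: the shortcut fails, and the best-fit
# coefficients solve the coupled Gram system G a = b with G_jk = <e_j, e_k>.
poly = [x ** k for k in range(5)]
G = np.array([[inner(p, q) for q in poly] for p in poly])
b = np.array([inner(f, p) for p in poly])
a = np.linalg.solve(G, b)  # every coefficient depends on all the others

The first computation is n independent one-line integrals; the second needs a full n-by-n solve, which is exactly what becomes hopeless when the basis is countably infinite.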
 
pasmith said:
Two vectors are orthogonal if their inner product vanishes.

Why do we care about basis vectors being orthogonal? It makes it trivial to find the components: if $\{e_1, \dots, e_n\}$ are orthogonal, then $$ v = \sum_{k = 1}^n \frac{\langle v, e_k \rangle}{\langle e_k, e_k \rangle} e_k. $$ If the basis vectors were not orthogonal, it would be necessary to solve a system of n simultaneous equations to find the components.

Functions in this context are vectors, with the complication that the dimension of the space is infinite. Do you want to have to solve a countably infinite system of simultaneous equations to find the components with respect to some basis?

Yes! Thank you. This is the answer I was looking for.
 