Orthogonal polynomials are perpendicular?

  • Thread starter A Dhingra
  • #1

Hi.

As the title suggests, I have a query regarding orthogonal polynomials.
What is the problem with defining orthogonality of polynomials as: the tangents of the two polynomials are perpendicular to each other at every x? This simply follows perpendicular vectors, planes, etc.

What was the need to define orthogonality as the inner product of the two polynomials being zero? The inner product is given as
##\int w(x)\, f_1(x)\, f_2(x)\, dx = 0## over a defined interval (which determines the limits of integration),
where ##w(x)## is called the weight function, with ##w(x) > 0## for all x in the given interval.

Can someone explain what this inner product being zero means geometrically, if possible?
(Please pardon me for asking for a geometrical explanation on this Abstract & Linear Maths forum.)
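As a concrete sketch of the integral definition (my example, not part of the question): the Legendre polynomials are orthogonal on [-1, 1] with weight ##w(x) = 1##, and a Gauss-Legendre quadrature rule evaluates such inner products exactly for polynomial integrands:

```python
import numpy as np
from numpy.polynomial.legendre import Legendre, leggauss

# Legendre polynomials are orthogonal on [-1, 1] with weight w(x) = 1.
# 10-point Gauss-Legendre quadrature is exact for polynomials up to degree 19.
x, w = leggauss(10)
p2 = Legendre.basis(2)  # P_2(x)
p3 = Legendre.basis(3)  # P_3(x)

cross = np.sum(w * p2(x) * p3(x))  # <P_2, P_3>: vanishes by orthogonality
norm = np.sum(w * p2(x) * p2(x))   # <P_2, P_2> = 2/5: nonzero
print(cross, norm)
```

The "perpendicularity" lives in this integral, not in the graphs of the two curves.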
 

Answers and Replies

  • #2
What is the problem with defining orthogonality of polynomials as: the tangents of the two polynomials are perpendicular to each other at every x?
That would give a very boring definition of "perpendicular" - only linear, non-constant polynomials could be perpendicular to each other.
 
  • #3
pwsnafu
Science Advisor
Think about why we care about perpendicular vectors in the first place.
 
  • #4
That would give a very boring definition of "perpendicular" - only linear, non-constant polynomials could be perpendicular to each other.
Can we not have polynomials whose tangents are always perpendicular to each other, apart from straight lines (formed by linear polynomials)?
 
  • #5
Think about why we care about perpendicular vectors in the first place.
You are presumably pointing out that being at 90° to each other (geometrically nothing special) is not why we care about perpendicular vectors: they are linearly independent, that is, they can form a basis of a vector space. The same is true for orthogonal polynomials. (I am reminded that I studied vector spaces this year.)

Can you explain how the definition of the inner product leads us to this linear independence of orthogonal polynomials (if it does)?
 
  • #6
AlephZero
Science Advisor
Homework Helper
Can you explain how the definition of the inner product leads us to this linear independence of orthogonal polynomials (if it does)?

It doesn't matter which inner product you use; the argument depends only on the general properties of inner products.

Write "." to mean "inner product", and (because I'm too lazy to type the general case in LaTeX!) consider just three orthogonal polynomials ##p_1##, ##p_2##, ##p_3##.

Suppose ##p_3## is linearly dependent on ##p_1## and ##p_2##.

##p_3 = a_1 p_1 + a_2 p_2##
So
##p_3.p_1 = a_1 p_1.p_1 + a_2 p_2.p_1##

If the polynomials are orthogonal,

##p_3.p_1 = 0## and ##p_2.p_1 = 0##

So

##0 = a_1 p_1.p_1 + 0##, and since ##p_1.p_1 \neq 0## (assuming ##p_1 \neq 0##), ##a_1 = 0##.

Similarly ##a_2## = 0.

So ##p_3 = 0## which is a contradiction.
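This argument can be checked numerically; here is a sketch using the first three Legendre polynomials as the ##p_i## (my choice of example, not from the post above). Their Gram matrix of pairwise inner products is diagonal with nonzero diagonal entries, and its full rank is exactly the linear independence being proved:

```python
import numpy as np
from numpy.polynomial.legendre import Legendre, leggauss

# Gram matrix G[i][j] = p_i . p_j for the first three Legendre
# polynomials on [-1, 1], via Gauss-Legendre quadrature.
x, w = leggauss(8)
polys = [Legendre.basis(k) for k in range(3)]
G = np.array([[np.sum(w * p(x) * q(x)) for q in polys] for p in polys])

# Orthogonality: off-diagonal entries vanish; diagonal is 2/(2k+1) > 0.
# A diagonal Gram matrix with nonzero diagonal has full rank, which is
# linear independence -- the a_1 = a_2 = 0 conclusion above.
print(np.round(G, 10))
```
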
 
  • #7
Can we not have polynomials whose tangents are always perpendicular to each other, apart from straight lines (formed by linear polynomials)?
The slopes of the tangents on f and g are the derivatives f' and g'. Orthogonality requires that the product of those slopes is -1: ##f'(x)g'(x)=-1## or ##f'(x)=-\frac{1}{g'(x)}##. The left side is a polynomial, therefore the right side has to be one as well. 1/polynomial is a polynomial only if g' is a (nonzero) constant, therefore g is linear. By the same argument, f is linear as well.
To be orthogonal with your proposed definition, both polynomials have to be straight, orthogonal lines.
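The degree bookkeeping in this argument can be made explicit; a small sketch (the quadratic and cubic below are hypothetical examples of mine): if ##f'(x)g'(x) = -1## for all x, the product is the constant polynomial -1 of degree 0, so ##\deg f' + \deg g' = 0##, forcing both derivatives to be constants.

```python
from numpy.polynomial import Polynomial

# Hypothetical non-linear pair: f(x) = x^2, g(x) = x^3.
f = Polynomial([0, 0, 1])
g = Polynomial([0, 0, 0, 1])
prod = f.deriv() * g.deriv()  # (2x) * (3x^2) = 6x^3

# deg(f'g') = deg f' + deg g' = 3 > 0, so f'g' cannot equal the
# constant -1 everywhere.
print(prod.degree())
```
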
 
  • #8
Maybe he meant that the tangent vectors need to be orthogonal only where the two functions intersect. That would certainly be a very geometric thing to do. But it is also not very useful, since there would be very few applications of such a thing.

The reason that we care about orthogonality as defined by integrals is because it is a useful concept. It shows up a lot in solving differential equations and in many other situations.

I'll admit that it is confusing to call f and g "orthogonal" since that would imply some geometric picture about f and g. The only reason we call them orthogonal is because of the analogy in the finite-dimensional case where it really does have a geometric picture. We keep the same name in the infinite-dimensional case. It's maybe not the best choice of words, but you'll have to get used to it.
 
  • #9
The left side is a polynomial, therefore the right side has to be one as well. 1/polynomial is a polynomial only if g' is a (nonzero) constant, therefore g is linear. By the same argument, f is linear as well.
To be orthogonal with your proposed definition, both polynomials have to be straight, orthogonal lines.

As you have argued, when g' is a constant, 1/g' is a polynomial. But can't we use expansion methods for negative powers of a degree-n polynomial (something like the binomial expansion for negative powers)? Using that, 1/g' could be a polynomial too.
For example, say g' = e^x (it can be approximated by a polynomial; let that approximation be g' instead);
then 1/g' = e^(-x), which can be approximated by another polynomial. Can this work?
(Please pardon me if this is a silly argument.)
 
  • #10
AlephZero
Science Advisor
Homework Helper
The thread title says "polynomials". You are now talking about functions that are not polynomials.

Sure, orthogonal functions which are not polynomials are often used in math (for example Fourier series), but we can only read what you wrote, not what you meant!
 
  • #11
The Maclaurin series for ##e^x## is ##\sum_{k=0}^{\infty} \frac{x^k}{k!}##, and for ##e^{-x}## it is ##\sum_{k=0}^{\infty} \frac{(-1)^k x^k}{k!}##.

These are both polynomials, aren't they?
Can't we use such approximating polynomials when talking about regular polynomials?
 
  • #12
pwsnafu
Science Advisor
The Maclaurin series for ##e^x## is ##\sum_{k=0}^{\infty} \frac{x^k}{k!}##, and for ##e^{-x}## it is ##\sum_{k=0}^{\infty} \frac{(-1)^k x^k}{k!}##.

These are both polynomials, aren't they?

Those are series, not polynomials. Polynomials have a finite number of terms.
 
  • #13
The Maclaurin series for ##e^x## is ##\sum_{k=0}^{\infty} \frac{x^k}{k!}##, and for ##e^{-x}## it is ##\sum_{k=0}^{\infty} \frac{(-1)^k x^k}{k!}##.

These are both polynomials, aren't they?
Can't we use such approximating polynomials when talking about regular polynomials?

A polynomial would be a finite sum. So we can approximate ##e^x## by polynomials to arbitrary accuracy, but it's not a polynomial itself because the sum is infinite.
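To make the distinction concrete, a small sketch (the helper name is mine): each degree-n truncation of the Maclaurin series is a genuine polynomial, and the truncations approach ##e^x## as n grows, but no single truncation equals it:

```python
import math

def maclaurin_exp(x, n):
    """Degree-n truncation of the Maclaurin series for e^x: a polynomial."""
    return sum(x**k / math.factorial(k) for k in range(n + 1))

# The truncation error at x = 1 shrinks as n grows but never reaches zero:
# e^x is a limit of polynomials, not a polynomial itself.
for n in (2, 5, 10):
    print(n, abs(maclaurin_exp(1.0, n) - math.e))
```
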
 
  • #14
A polynomial would be a finite sum. So we can approximate ##e^x## by polynomials to arbitrary accuracy, but it's not a polynomial itself because the sum is infinite.
Thanks for mentioning this. I thought a sum with any number of terms (finite or infinite) could be called a polynomial, which is not so.

So the conclusion is that orthogonal polynomials are just linearly independent, with their inner product being zero, and they do not have much of a geometrical interpretation in general.

Thanks everyone.
 
