A property of Chebyshev polynomials

In summary: Chebyshev polynomials satisfy (T_i o T_j)(x) = (T_j o T_i)(x) = T_{ij}(x) for all real x. On ]-1;+1[ this follows from the properties of cos(nA), and on [1;+inf[ from those of cosh(nA); since a nonzero polynomial has only finitely many zeros, the identity then extends to every real x. The thread also explores the relationship between Chebyshev polynomials and Lucas sequences, with the closely related polynomials C_n appearing in the Lucas-Lehmer test for Mersenne numbers.
  • #1
T.Rex
Hi,
I cannot find a proof (in MathWorld, in my mathematics dictionary, or on the Web) of the following property of Chebyshev polynomials:

(T_i o T_j)(x) = (T_j o T_i)(x) = T_{ij}(x) for x in ]-inf ; +inf[

Example :
T_2(x) = 2x^2-1
T_3(x) = 4x^3-3x
T_3(T_2(x)) = T_2(T_3(x)) = T_6(x)
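The composition in the example can be spot-checked numerically. This is a sketch (the helper name `cheb_T` is mine); it evaluates T_n via the standard recurrence T_0 = 1, T_1 = x, T_n = 2x T_{n-1} - T_{n-2}, and tests points well outside [-1, 1] as well as inside it:

```python
def cheb_T(n, x):
    """Evaluate the Chebyshev polynomial T_n at x via the recurrence
    T_0 = 1, T_1 = x, T_n = 2x*T_{n-1} - T_{n-2}."""
    a, b = 1.0, x          # T_0(x), T_1(x)
    if n == 0:
        return a
    for _ in range(n - 1):
        a, b = b, 2 * x * b - a
    return b

# Check T_3(T_2(x)) = T_2(T_3(x)) = T_6(x) at sample points.
for x in (-3.0, -0.5, 0.25, 2.0, 10.0):
    rhs = cheb_T(6, x)
    assert abs(cheb_T(3, cheb_T(2, x)) - rhs) < 1e-6 * max(1.0, abs(rhs))
    assert abs(cheb_T(2, cheb_T(3, x)) - rhs) < 1e-6 * max(1.0, abs(rhs))
```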

When x is in ]-1 ; +1[ , it is easy: set x = cos(A) and use T_n(cos(A)) = cos(nA) .

But how to do it when x is any real ?

Thanks,
Tony
 
  • #2
T.Rex said:
But how to do it when x is any real ?

You could probably do something clever by letting A be any complex number!


Try doing it with induction.
 
  • #3
Hyperbolic Cosine?

Huuumm,
I did not find it with A being any complex number.
But I thought of using cosh (the hyperbolic cosine). It has the same kind of multiple-angle properties as the cosine. With x = cosh(A), A ranges over ]-inf;+inf[ while x ranges over [1;+inf[ (the case x <= -1 then follows from the parity T_n(-x) = (-1)^n T_n(x)). So it's less general than taking A complex, but it answers my needs.
The proof:
We set x = cosh(A), so that T_n(x) = T_n(cosh(A)) = cosh(nA) .
So (T_i o T_j)(x) = T_i(cosh(jA)) = cosh(ijA) = (T_j o T_i)(x) = T_{ij}(x) .
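The key identity in the proof, T_n(cosh(A)) = cosh(nA), can itself be checked numerically. A small sketch (helper name `cheb_T` is mine), comparing the recurrence T_0 = 1, T_1 = x, T_n = 2x T_{n-1} - T_{n-2} against cosh:

```python
import math

def cheb_T(n, x):
    """Evaluate T_n at x via the recurrence."""
    a, b = 1.0, x
    if n == 0:
        return a
    for _ in range(n - 1):
        a, b = b, 2 * x * b - a
    return b

# T_n(cosh(A)) = cosh(n*A) for several A and n.
for A in (0.0, 0.3, 1.0, 2.5):
    for n in range(8):
        assert math.isclose(cheb_T(n, math.cosh(A)), math.cosh(n * A),
                            rel_tol=1e-9, abs_tol=1e-9)
```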

Now, I have another series of Polynomials that are very close to Chebyshev polynomials:
S_0(x) = 2
S_1(x) = x
S_i(x) = x.S_{i-1}(x) - S_{i-2}(x) (the same form of recurrence as for T_n(x), but with x in place of 2x).
So: S_2(x) = x^2 - 2 , S_3(x) = x^3 - 3x .
It seems to have the same property: S_i o S_j = S_j o S_i = S_{ij} .
But there is now no way to connect S_n(x) with the properties of cos(nx) or of cosh(nx) ...
I just noticed that, if S_n(x) = Sum{ a_i . x^i } , then T_n(x) = Sum{ 2^(i-1) . a_i . x^i } , where i=0..n .
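The coefficient observation can be verified exactly by building both families from their recurrences and comparing coefficients with rational arithmetic (all helper names here are mine; this is just a sketch):

```python
from fractions import Fraction

def poly_mul_x(p, c):
    """Return c*x*p for a coefficient list p (index = power of x)."""
    return [0] + [c * a for a in p]

def poly_sub(p, q):
    n = max(len(p), len(q))
    p = p + [0] * (n - len(p))
    q = q + [0] * (n - len(q))
    return [a - b for a, b in zip(p, q)]

def family(c0, x_coeff):
    """Coefficient lists for P_0 = c0, P_1 = x, P_n = x_coeff*x*P_{n-1} - P_{n-2}."""
    polys = [[c0], [0, 1]]
    for _ in range(10):
        polys.append(poly_sub(poly_mul_x(polys[-1], x_coeff), polys[-2]))
    return polys

T = family(1, 2)   # Chebyshev T_n:  T_n = 2x T_{n-1} - T_{n-2}
S = family(2, 1)   # the S_n above:  S_n =  x S_{n-1} - S_{n-2}

# If S_n(x) = sum a_i x^i, then T_n(x) = sum 2^(i-1) a_i x^i.
for n in range(12):
    for i, a in enumerate(S[n]):
        t = T[n][i] if i < len(T[n]) else 0
        assert Fraction(t) == Fraction(2) ** (i - 1) * a
```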

These polynomials are used to prove certain properties of Lucas sequences, and in particular the famous LLT (Lucas-Lehmer Test): L_0 = 4 , L_i = L_{i-1}^2 - 2 ; M_q = 2^q - 1 is prime iff M_q divides L_{q-2} .
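The LLT as stated is simple to implement; a minimal sketch (reducing mod M_q at each step to keep the numbers small):

```python
# Lucas-Lehmer test: L_0 = 4, L_i = L_{i-1}^2 - 2;
# M_q = 2^q - 1 is prime iff M_q divides L_{q-2}.

def lucas_lehmer(q):
    """Return True iff M_q = 2^q - 1 is prime (for odd prime q > 2)."""
    m = (1 << q) - 1
    L = 4
    for _ in range(q - 2):
        L = (L * L - 2) % m       # reduce mod M_q at every step
    return L == 0

# Exponents 3, 5, 7, 13 give Mersenne primes; 11 and 23 do not.
assert [q for q in (3, 5, 7, 11, 13, 23) if lucas_lehmer(q)] == [3, 5, 7, 13]
```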

Tony
 
  • #4
T.Rex said:
I did not find with A being any complex number.
But I thought to use cosh

That's essentially what I was suggesting; the neat thing about cos and cosh is that they're related via cosh x = cos ix. (And also cosh ix = cos x)


Here's another, fairly powerful trick that I wasn't able to remember when I wrote my post:

If f(x) is a polynomial, it either has a finite number of zeroes, or it is the zero polynomial.

You were able to show, using the relationship with cosines, that the polynomial (T_j o T_i)(x) - T_{ij}(x) = 0 for all x in the interval [-1, 1]. This implies the polynomial must be zero for all x!
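The same conclusion can be checked symbolically: working with exact integer coefficient lists, (T_j o T_i)(x) - T_{ij}(x) reduces to the zero polynomial for each pair tried. A sketch (all helper names mine):

```python
def p_add(p, q):
    n = max(len(p), len(q))
    return [(p[i] if i < len(p) else 0) + (q[i] if i < len(q) else 0)
            for i in range(n)]

def p_scale(p, c):
    return [c * a for a in p]

def p_mul(p, q):
    out = [0] * (len(p) + len(q) - 1)
    for i, a in enumerate(p):
        for j, b in enumerate(q):
            out[i + j] += a * b
    return out

def cheb_coeffs(n):
    """Coefficient list of T_n from T_0 = 1, T_1 = x, T_n = 2x T_{n-1} - T_{n-2}."""
    a, b = [1], [0, 1]
    for _ in range(n):
        a, b = b, p_add([0] + p_scale(b, 2), p_scale(a, -1))
    return a

def p_compose(p, q):
    """Coefficients of p(q(x))."""
    out, power = [0], [1]
    for a in p:
        out = p_add(out, p_scale(power, a))
        power = p_mul(power, q)
    return out

# (T_j o T_i) - T_{ij} is identically zero.
for i in (2, 3, 4):
    for j in (2, 3, 5):
        diff = p_add(p_compose(cheb_coeffs(j), cheb_coeffs(i)),
                     p_scale(cheb_coeffs(i * j), -1))
        assert all(c == 0 for c in diff)
```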



I just noticed that, if S_n(x) = Sum{ a_i . x^i } , then T_n(x) = Sum{ 2^(i-1) . a_i . x^i } , where i=0..n .

So, in other words, T_n(x) = (1/2) * S_n(2x) ?
 
  • #5
It is not clear to me that if you move outside of the region [-1,1] you are still dealing with a Chebyshev polynomial. The definition:
[tex] T_k(\cos \Theta ) = \cos k \Theta [/tex]

results in properties used to prove bounds on the resulting errors. If you do not have the cosine, then all bets are off on the errors of the resulting function.

One main use of the Chebyshev polynomials is to obtain the Chebyshev points. These are the points which should be used for ANY polynomial approximation to a function. Just for interest's sake, the expanded Chebyshev points for an arbitrary interval [a,b] are given by:

[tex]2 x_i = a + b + (a-b) \frac { \cos ( \frac {2i+1} {2n +2} \pi)} {\cos( \frac {\pi} {2n +2})}[/tex]

Reference:
Elementary Numerical Analysis: An Algorithmic Approach by Conte and De Boor
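The quoted formula translates directly into a small helper (the function name is mine; i runs over 0..n, giving n+1 points, and the "expansion" makes the endpoints of [a,b] themselves interpolation points):

```python
import math

def expanded_chebyshev_points(a, b, n):
    """Expanded Chebyshev points on [a, b] for degree n (n+1 points),
    per the Conte/De Boor formula quoted above."""
    d = math.cos(math.pi / (2 * n + 2))
    return [0.5 * (a + b + (a - b) *
                   math.cos((2 * i + 1) * math.pi / (2 * n + 2)) / d)
            for i in range(n + 1)]

pts = expanded_chebyshev_points(-1.0, 1.0, 4)
# The expansion pushes the first and last points onto the interval endpoints.
assert math.isclose(pts[0], -1.0, abs_tol=1e-12)
assert math.isclose(pts[-1], 1.0, abs_tol=1e-12)
```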
 
  • #6
Chebyshev polynomials for Lucas Sequences (LLT).

Thanks Hurkyl !
Yes you're right !
Hurkyl said:
So, in other words. [tex]T_n(x) = \frac{1}{2} S_n(2x) [/tex] ?
That works fine !
So it proves that [tex]C_n[/tex] has the same properties as [tex]T_n[/tex] !
Tony
 
  • #7
x in ]-inf ; + inf[ or x in ]-1 ; +1[ ??

Hi Integral,
Integral said:
It is not clear to me that if you move outside of the region [-1,1] that you are still dealing with a Chebyshev polynomial. The definition: [tex] T_k(Cos \Theta ) = Cos k \Theta [/tex] results in properties used to prove bounds on the resulting errors. If you do not have the cosine then all bets are off on the errors of the resulting function.
Yes, this is a good question ! My mathematical dictionary also defines the Chebyshev polynomials by means of [tex]x=\cos \Theta[/tex] . So, can we still talk of Chebyshev polynomials when x is any real ?

My goal is to understand the relationships between the Chebyshev polynomials and the Lucas sequences: [tex] V_n = P V_{n-1} - Q V_{n-2} [/tex] . I'm reading H. C. Williams' book "Édouard Lucas and Primality Testing". On page 77, Williams introduces the Chebyshev polynomials ([tex]C_n[/tex] and [tex]S_n[/tex], not the [tex]T_n[/tex] !) as a tool for establishing properties of the Lucas sequences ([tex]U_n[/tex] and [tex]V_n[/tex]). He does not say what the values of [tex]x[/tex] are ...

(It appears that Chebyshev polynomials and Lucas sequences share several properties. As an example, [tex]V_{kn} = V_n V_{(k-1)n} - Q^n V_{(k-2)n} [/tex] produces (with k=2,3,...) relationships very close to those of the Chebyshev polynomials: [tex]V_{2n}=V_n^2-2Q^n[/tex] ...)

The interesting part is that [tex]C_2(x) = x^2-2[/tex] appears in the Lucas-Lehmer test, and that [tex]C_3(x) = x^3-3x[/tex] makes it possible to find valid starting values [tex]L_0[/tex] other than 4. (For example, [tex]C_3(4)=52\equiv -10\pmod{M_5}[/tex] ; and with [tex]L_0=-10[/tex] we get [tex]L_3=0[/tex] , proving [tex]M_5[/tex] is prime.)
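The worked example checks out numerically: C_3(4) = 52 ≡ -10 (mod M_5), and the LLT iteration started from -10 still reaches 0 at step q-2 = 3. A quick sketch:

```python
M5 = 2**5 - 1                              # M_5 = 31

# C_3(x) = x^3 - 3x, so C_3(4) = 52 ≡ -10 (mod 31).
assert (4**3 - 3 * 4) % M5 == (-10) % M5

# Run the LLT squarings from the alternative seed L_0 = -10.
L = -10
for _ in range(3):                         # q - 2 = 3 steps for q = 5
    L = (L * L - 2) % M5
assert L == 0                              # so M_5 = 31 is prime
```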

Reminder: The Lucas Lehmer Test (LLT) for Mersenne numbers is:
[tex] L_0=4 , L_i = L_{i-1}^2 - 2=C_2(L_{i-1}) ; M_q=2^q-1 [/tex] is prime iff [tex] L_{q-2} \equiv 0\pmod{M_q}[/tex] .

My questions are: 1) Why does [tex]C_3[/tex] produce numbers A (like 4) such that [tex]2+A[/tex] is a quadratic non-residue usable as [tex]L_0[/tex] ? 2) What are the properties of the [tex]C_n[/tex] polynomials with respect to Lucas sequences when n is a prime ?

Tony

I think what I call [tex]C_n[/tex] here is what I called [tex]S_n[/tex] in the previous posts ... [tex]C_n[/tex] is the name used by Williams.
[tex]C_0(x) = 2 ,
C_1(x) = x ,
C_i(x) = x C_{i-1}(x) - C_{i-2}(x) [/tex] .
 

1. What are Chebyshev polynomials?

Chebyshev polynomials, named after the Russian mathematician Pafnuty Chebyshev, are a family of orthogonal polynomials used in mathematics and engineering. On [-1,1] they are defined by T_n(x) = cos(n arccos(x)), where n is the degree of the polynomial; equivalently, they satisfy the recurrence relation T_0(x) = 1, T_1(x) = x, T_n(x) = 2x T_{n-1}(x) - T_{n-2}(x), which defines them for all x.
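The two definitions can be compared numerically on [-1,1]; a sketch (helper name `cheb_T` is mine):

```python
import math

def cheb_T(n, x):
    """T_n via the recurrence T_0 = 1, T_1 = x, T_n = 2x T_{n-1} - T_{n-2}."""
    a, b = 1.0, x
    if n == 0:
        return a
    for _ in range(n - 1):
        a, b = b, 2 * x * b - a
    return b

# On [-1, 1], the recurrence agrees with cos(n * arccos(x)).
for x in (-0.9, -0.3, 0.0, 0.5, 0.99):
    for n in range(10):
        assert math.isclose(cheb_T(n, x), math.cos(n * math.acos(x)),
                            abs_tol=1e-9)
```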

2. What is the significance of Chebyshev polynomials?

Chebyshev polynomials have many applications in mathematics and engineering, including approximation of functions, solving differential equations, and in the field of numerical analysis. They also have connections to other areas of mathematics, such as number theory and combinatorics.

3. What is the main property of Chebyshev polynomials?

The main property of Chebyshev polynomials is that they are orthogonal: the integral over [-1,1] of the product of two different polynomials in the family, with respect to the weight function 1/√(1-x^2), is equal to 0. This property allows for efficient calculations and applications in various fields.
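The orthogonality relation is easy to check numerically: with the substitution x = cos(θ), the weighted integral of T_m T_n over [-1,1] becomes the integral of cos(mθ)cos(nθ) over [0,π], which vanishes for m ≠ n and equals π/2 for m = n ≥ 1. A sketch using a simple midpoint sum (the helper name and step count are mine):

```python
import math

def cheb_inner(m, n, steps=200000):
    """Midpoint-rule approximation of the weighted inner product of T_m and T_n:
    integral of cos(m*t)*cos(n*t) over [0, pi]."""
    h = math.pi / steps
    return sum(math.cos(m * t) * math.cos(n * t)
               for t in (h * (k + 0.5) for k in range(steps))) * h

assert abs(cheb_inner(2, 3)) < 1e-6                   # m != n: zero
assert abs(cheb_inner(1, 4)) < 1e-6
assert abs(cheb_inner(3, 3) - math.pi / 2) < 1e-6     # m = n >= 1: pi/2
```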

4. Can Chebyshev polynomials be used to approximate any function?

Yes, Chebyshev polynomials can be used to approximate any continuous function on the interval [-1,1] to arbitrary accuracy. That every continuous function on a closed interval can be approximated by polynomials is the Weierstrass approximation theorem, a fundamental result in mathematical analysis; truncated Chebyshev expansions are a particularly well-behaved way of achieving this.

5. Are there any limitations or drawbacks to using Chebyshev polynomials?

One limitation of Chebyshev polynomials is that they are naturally defined on the interval [-1,1]; functions on another interval [a,b] must first be rescaled onto it by a linear change of variable. Additionally, the accuracy of the approximation depends on the number of terms used, so a higher-degree polynomial may be needed for more precise approximations. Other families of orthogonal polynomials may also be better suited for certain applications.
