Repeated substitution and Chebyshev polynomials

2020-01-31 (2020-01-26)

Recently, I was experimenting with a process in which we start with the polynomial $f_0(x) = x$ and obtain $f_n(x) = f_{n-1}\left(\frac{1}{s-x}\right)$ by substituting $x \mapsto \frac{1}{s-x}$, for some number $s \in \mathbb{C}$. By accident, I saw that both for $s = 1$ and $s = -1$, $f_3 = f_0$. Of course, for $s = 0$, $f_2 = f_0$ as well. This made me wonder: for which values of $s$ do we have $f_n = f_0$ for some $n > 0$?

The first few polynomials

We have

$$\begin{aligned}
f_0 &= \frac{x}{1};\\
f_1 &= \frac{1}{s-x};\\
f_2 &= \frac{s-x}{s^2 - 1 - sx};\\
f_3 &= \frac{s^2 - 1 - sx}{s^3 - 2s - (s^2 - 1)x};\\
f_4 &= \frac{s^3 - 2s - (s^2 - 1) x}{s^4 - 3 s^2 + 1 - (s^3 - 2s) x}.
\end{aligned}$$
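These are tedious to compute by hand, so here is a minimal sketch (assuming SymPy is available; all names are my own) that builds $f_0, \ldots, f_4$ by repeatedly substituting $x \mapsto \frac{1}{s-x}$, and also checks the accidental observation that $f_3 = f_0$ for $s = \pm 1$:

```python
# Build f_0, ..., f_4 by repeatedly substituting x -> 1/(s - x) and cancelling,
# then check that f_3 = f_0 for s = 1 and s = -1.
from sympy import symbols, cancel, simplify

x, s = symbols("x s")

fs = [x]  # f_0
for n in range(1, 5):
    fs.append(cancel(fs[-1].subs(x, 1 / (s - x))))  # f_n(x) = f_{n-1}(1/(s - x))

for n, fn in enumerate(fs):
    print(f"f_{n} =", fn)

print(simplify(fs[3].subs(s, 1) - x))   # 0, so f_3 = f_0 when s = 1
print(simplify(fs[3].subs(s, -1) - x))  # 0, so f_3 = f_0 when s = -1
```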

Now we can calculate that $f_4 = x$ means that

$$s^3 - 2s - (s^2 - 1) x = (s^4 - 3 s^2 + 1) x - (s^3 - 2s) x^2,$$

so the terms with $x^2$ on the left-hand side (these terms amount to $0$) must match the terms with $x^2$ on the right-hand side (they amount to $-(s^3 - 2s)$), the terms with $x$ on the left must match the ones on the right, and the same for the constant terms. This gives the following equations:

$$0 = s^3 - 2s \land -(s^2 - 1) = s^4 - 3 s^2 + 1 \land s^3 - 2 s = 0,$$

$$s^3 - 2 s = 0 \land s(s^3 - 2s) - (s^2 - 1) = -(s^2 - 1),$$

$$s^3 - 2 s = 0,$$

$$s = 0 \lor s = \sqrt{2} \lor s = -\sqrt{2}.$$
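The same coefficient matching can be done mechanically. A small sketch, again assuming SymPy (the variable names are mine):

```python
# Redo the coefficient matching for f_4 = x: clear the denominator, require
# every coefficient (as a polynomial in x) to vanish, and intersect the solutions.
from sympy import symbols, cancel, fraction, Poly, solve

x, s = symbols("x s")

f = x
for _ in range(4):
    f = cancel(f.subs(x, 1 / (s - x)))  # f_4

num, den = fraction(f)
coeffs = Poly(num - x * den, x).all_coeffs()  # f_4 = x  <=>  num - x*den = 0
solutions = set.intersection(*(set(solve(c, s)) for c in coeffs))
print(solutions)  # s = 0 and s = +-sqrt(2)
```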

We could have expected the solution $s = 0$, since $f_4 = f_2(f_2)$ and if $s = 0$ then $f_2 = f_0$, so $f_4(x) = f_2(f_2(x)) = f_2(f_0(x)) = f_2(x) = f_0(x)$.

More generally, if a number $d$ divides $n$, and $f_d = f_0$ for some value $s = s_0$, then $f_n = f_0$ for $s = s_0$ as well.

An easier description

Notice that in the equations above, the denominator of $f_n$ matches the numerator of $f_{n+1}$, and that within each $f_n$ the coefficient of $x$ in the denominator matches the constant term of the numerator up to sign. This provides us with an easier way to describe the polynomials $f_n$.

More formally, we can define a sequence of polynomials $g_n$ by $g_{-2} = -1$, $g_{-1} = 0$ and $g_n = s g_{n-1} - g_{n-2}$, so

$$\begin{aligned}
g_{-2} &= -1;\\
g_{-1} &= 0;\\
g_0 &= 1;\\
g_1 &= s;\\
g_2 &= s^2 - 1;\\
g_3 &= s^3 - 2 s;\\
g_4 &= s^4 - 3 s^2 + 1;\\
g_5 &= s^5 - 4 s^3 + 3 s;\\
g_6 &= s^6 - 5 s^4 + 6 s^2 - 1.
\end{aligned}$$
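These are easy to generate with the recurrence; a short sketch (SymPy assumed):

```python
# Generate g_{-2}, g_{-1}, g_0, ... with the recurrence g_n = s*g_{n-1} - g_{n-2}.
from sympy import symbols, expand

s = symbols("s")

g = [-1, 0]  # g_{-2} and g_{-1}
for _ in range(7):
    g.append(expand(s * g[-1] - g[-2]))

for n, gn in enumerate(g, start=-2):
    print(f"g_{n} =", gn)
```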

We will now claim that

$$f_n = \frac{g_{n-1} - g_{n-2} x}{g_n - g_{n-1} x}.$$

Indeed,

$$f_0 = \frac{x}{1} = \frac{0 - (-1) x}{1 - 0 \cdot x} = \frac{g_{-1} - g_{-2} x}{g_0 - g_{-1} x}.$$

Also, if the claim holds for $f_n$, then

$$f_{n+1} = \frac{g_{n-1} - g_{n-2} \cdot \frac{1}{s-x}}{g_n - g_{n-1} \cdot \frac{1}{s-x}} = \frac{(s-x) g_{n-1} - g_{n-2}}{(s-x) g_n - g_{n-1}} = \frac{s g_{n-1} - g_{n-2} - g_{n-1} x}{s g_n - g_{n-1} - g_n x} = \frac{g_n - g_{n-1} x}{g_{n+1} - g_n x}.$$

The second equality is obtained by multiplying numerator and denominator by $s - x$. This shows that the claim holds for $f_{n+1}$ as well, and we conclude by induction that the claim holds for all $f_n$.
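For peace of mind, the closed form can also be checked mechanically for the first few $n$ (SymPy assumed, names mine):

```python
# Check f_n = (g_{n-1} - g_{n-2}*x) / (g_n - g_{n-1}*x) for n = 0, ..., 8.
from sympy import symbols, cancel, simplify

x, s = symbols("x s")

g = [-1, 0]  # g[k] holds g_{k-2}, so this is g_{-2}, g_{-1}, g_0, ... in order
for _ in range(10):
    g.append(s * g[-1] - g[-2])

f = x  # f_0
for n in range(9):
    claimed = (g[n + 1] - g[n] * x) / (g[n + 2] - g[n + 1] * x)
    assert simplify(f - claimed) == 0
    f = cancel(f.subs(x, 1 / (s - x)))  # move on to f_{n+1}
print("the closed form matches f_0, ..., f_8")
```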

Finding a couple more values of $s$

If we want $f_n = f_0 = x$, we must have

$$\frac{g_{n-1} - g_{n-2} x}{g_n - g_{n-1} x} = x,$$

$$g_{n-1} - g_{n-2} x = g_n x - g_{n-1} x^2,$$

$$g_{n-1} = 0 \land - g_{n-2} = g_n = s g_{n-1} - g_{n-2},$$

$$g_{n-1} = 0 \land s g_{n-1} = 0,$$

$$g_{n-1} = 0.$$
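So the values of $s$ we are after are exactly the roots of $g_{n-1}$. A quick numerical check of this, assuming SymPy (the starting point $0.123$ is arbitrary):

```python
# For each numerical root s0 of g_{n-1}, iterating x -> 1/(s0 - x) n times
# should return the starting value, i.e. f_n = f_0 at s = s0.
from sympy import symbols, expand, nroots

s = symbols("s")

g = [-1, 0]  # g_{-2}, g_{-1}
for _ in range(8):
    g.append(expand(s * g[-1] - g[-2]))

for n in range(2, 7):
    for s0 in nroots(g[n + 1]):  # g[n + 1] is g_{n-1}
        xk = 0.123               # arbitrary starting value
        for _ in range(n):
            xk = 1 / (s0 - xk)
        print(n, s0, abs(xk - 0.123) < 1e-9)  # True for every root
```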

For example, $f_5 = f_0$ iff

$$0 = s^4 - 3 s^2 + 1 = (s^2 + s - 1)(s^2 - s - 1)$$

which has solutions $s = \pm \frac{1}{2} \pm \frac{1}{2} \sqrt{5}$. It struck me that $\frac{1}{2} + \frac{1}{2} \sqrt{5}$ is exactly the diagonal length of a regular pentagon with unit sides, whilst $\sqrt{2}$, a root of $g_{3}$, is the diagonal length of a unit square, and $1$, a root of $g_{2}$, is the side length of an equilateral triangle with unit sides, so in each case the distance between vertices that are two edges apart. It seems that $2\cos{\frac{\pi}{n}}$ is always a root of $g_{n-1}$. We will prove this and more in the next two sections.
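The observation about the pentagon is easy to confirm (SymPy assumed):

```python
# Factor g_4, list its roots, and compare the largest root with the diagonal
# of a regular pentagon with unit sides, 2*cos(pi/5).
from sympy import symbols, factor, solve, simplify, cos, pi, sqrt, Rational

s = symbols("s")
g4 = s**4 - 3 * s**2 + 1

print(factor(g4))    # (s**2 - s - 1)*(s**2 + s - 1)
print(solve(g4, s))  # the four values +-1/2 +- sqrt(5)/2

phi = Rational(1, 2) + sqrt(5) / 2
print(simplify(g4.subs(s, phi)))        # 0, so phi is a root of g_4
print(simplify(2 * cos(pi / 5) - phi))  # 0, so phi = 2*cos(pi/5)
```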

Chebyshev polynomials

In this section, we will show that our polynomials $g_n$ are almost equal to the so-called "Chebyshev polynomials of the second kind". Knowing this, and the way that these polynomials are defined, will provide us with a neat description of the $s$ for which $g_n(s) = 0$.

Firstly, recall two trigonometric identities. The angle sum identity gives

$$\sin((n + 1)\theta) = \sin(\theta + n \theta) = \sin(\theta) \cos(n \theta) + \cos(\theta) \sin(n \theta),$$

which we will call (1), and the product-to-sum identity gives

$$2 \cos(n \theta) \sin(\theta) = \sin(n \theta + \theta) - \sin(n \theta - \theta) = \sin((n + 1) \theta) - \sin((n - 1) \theta),$$

which we will call (2). Combining these, we obtain

$$\begin{aligned}
2 \sin((n + 1)\theta) &= 2 \sin(\theta) \cos(n \theta) + 2 \cos(\theta) \sin(n \theta)\\
&= \sin((n + 1) \theta) - \sin((n - 1) \theta) + 2 \cos(\theta) \sin(n \theta)
\end{aligned}$$

or

$$\sin((n+1) \theta) = 2 \cos(\theta) \sin(n \theta) - \sin((n - 1) \theta),$$

which we will call (3).
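Identity (3) is elementary, but a quick numeric spot check never hurts (plain Python):

```python
# Spot-check sin((n+1)*t) = 2*cos(t)*sin(n*t) - sin((n-1)*t) for a few n and t.
import math

for n in range(1, 6):
    for t in (0.3, 1.1, 2.5):
        lhs = math.sin((n + 1) * t)
        rhs = 2 * math.cos(t) * math.sin(n * t) - math.sin((n - 1) * t)
        assert abs(lhs - rhs) < 1e-12
print("identity (3) holds at the sampled points")
```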

We will now define polynomials $U_n$ by

$$U_n(\cos(\theta)) = \frac{\sin((n+1) \theta)}{\sin \theta}.$$

These are the so-called "Chebyshev polynomials of the second kind". It is not at all obvious that these are actually polynomials, but we will show that now. We have

$$U_0(\cos(\theta)) = \frac{\sin(\theta)}{\sin(\theta)} = 1$$

and, by (1),

$$U_1(\cos(\theta)) = \frac{\sin(2 \theta)}{\sin(\theta)} = \frac{\sin(\theta)\cos(\theta) + \cos(\theta) \sin(\theta)}{\sin(\theta)} = 2 \cos(\theta),$$

so $U_0(x) = 1$ and $U_1(x) = 2x$. Now we can use (3) to find a recurrence relation for $U_n$:

$$U_n(\cos(\theta)) = \frac{\sin((n+1)\theta)}{\sin(\theta)} = \frac{2\cos(\theta) \sin(n \theta) - \sin((n - 1)\theta)}{\sin(\theta)} = 2 \cos(\theta)\, U_{n-1}(\cos(\theta)) - U_{n-2}(\cos(\theta)),$$

or

$$U_n(x) = 2 x U_{n-1}(x) - U_{n-2}(x).$$
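SymPy ships these polynomials as chebyshevu, which gives an independent check of the recurrence and of the defining sine ratio (the latter checked numerically at a few sample angles):

```python
# Build U_0, ..., U_6 with the recurrence, compare against sympy.chebyshevu,
# and spot-check U_n(cos(t)) = sin((n+1)*t)/sin(t) numerically.
import math
from sympy import symbols, chebyshevu, expand, S

x = symbols("x")

U = [S(1), 2 * x]  # U_0, U_1
for n in range(2, 7):
    U.append(expand(2 * x * U[-1] - U[-2]))

for n, Un in enumerate(U):
    assert expand(Un - chebyshevu(n, x)) == 0  # matches SymPy's U_n
    for t in (0.4, 1.0, 2.2):
        lhs = float(Un.subs(x, math.cos(t)))
        rhs = math.sin((n + 1) * t) / math.sin(t)
        assert abs(lhs - rhs) < 1e-9
print("the recurrence reproduces chebyshevu and the sine ratio")
```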

We can now claim that for all $n \geq 0$,

$$g_n(s) = U_n\left(\frac{s}{2}\right).$$

Indeed, $g_0(s) = 1 = U_0\left(\frac{s}{2}\right)$ and $g_1(s) = s = 2 \cdot \frac{s}{2} = U_1\left(\frac{s}{2}\right)$. Now, suppose the claim holds for all $g_m$ with $0 \leq m < n$. Then

$$g_n(s) = s g_{n-1}(s) - g_{n-2}(s) = 2 \cdot \frac{s}{2} U_{n-1}\left(\frac{s}{2}\right) - U_{n-2}\left(\frac{s}{2}\right) = U_n\left(\frac{s}{2}\right).$$
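This, too, is easy to verify for the first few $n$ (SymPy assumed):

```python
# Check g_n(s) = U_n(s/2) for n = 0, ..., 8.
from sympy import symbols, chebyshevu, expand

s = symbols("s")

g = [-1, 0]  # g_{-2}, g_{-1}
for _ in range(9):
    g.append(expand(s * g[-1] - g[-2]))

for n in range(9):
    assert expand(g[n + 2] - chebyshevu(n, s / 2)) == 0
print("g_n(s) = U_n(s/2) for n = 0, ..., 8")
```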

The zeroes of the Chebyshev polynomials

By definition, $U_{n-1}(\cos(\theta)) = \frac{\sin(n \theta)}{\sin \theta}$, so if we want $U_{n-1}\left(\frac{1}{2} s\right) = 0$ and write $\frac{1}{2} s = \cos(\theta)$, we must have that $\sin(n \theta) = 0$, so $n \theta = m \pi$ for some integer $0 \leq m \leq n$. We must also have that $\sin(\theta)$ is not zero, so $0 < m < n$. This means that

$$s = 2 \cos\left(\frac{m \pi}{n}\right).$$
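These values are indeed roots of $g_{n-1}$; and since $g_{n-1}$ has degree $n-1$ and the $n-1$ numbers $2\cos\left(\frac{m\pi}{n}\right)$ with $0 < m < n$ are distinct, they account for all of its roots. A quick numerical check (SymPy assumed):

```python
# Check numerically that s = 2*cos(m*pi/n) is a root of g_{n-1} for 0 < m < n.
import math
from sympy import symbols, expand

s = symbols("s")

g = [-1, 0]  # g_{-2}, g_{-1}
for _ in range(9):
    g.append(expand(s * g[-1] - g[-2]))

for n in range(2, 9):
    gn1 = g[n + 1]  # g_{n-1}
    for m in range(1, n):
        value = gn1.subs(s, 2 * math.cos(m * math.pi / n))
        assert abs(value) < 1e-9
print("2*cos(m*pi/n) is a root of g_{n-1} for all 0 < m < n, n <= 8")
```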

Conclusion

To recap: we have shown that $f_0 = f_n$ exactly when the polynomial $g_{n-1}$ that we defined is zero at $s$. Since $g_{n-1}(s) = U_{n-1}\left(\frac{1}{2} s\right)$, this happens precisely when $U_{n-1}\left(\frac{s}{2}\right) = 0$, which happens exactly when, for some integer $m$ with $0 < m < n$,

$$s = 2 \cos\left(\frac{m \pi}{n}\right).$$