3. Calculating Taylor coefficients

We start with some easy ones: polynomials should be quite trivial because the Taylor coefficients will just be the coefficients of the polynomial! Then we move on to more complicated functions.

We will mostly work with Taylor series centered at 0. Remember that they can be centered around any other number, but we rarely do that.

Remember that the general formulation of the Taylor coefficients is that we have a power series:

\[f(x) = \sum_{k=0}^{\infty} a_k x^k\]

and the coefficients \({a_k}\) are given by:

\[a_k = \frac{f^{(k)}(0)}{k!}\]

where \(f^{(k)}\) is the order-\(k\) derivative of the function \(f(x)\), evaluated at \(x = 0\); in particular \(f^{(0)}\) (the zeroth derivative) is just \(f\) itself, so \(f^{(0)}(0) = f(0)\).
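As a concrete sketch of the formula, we can evaluate \(a_k = f^{(k)}(0)/k!\) directly with sympy (the helper name `taylor_coefficient` is just an illustrative choice, not anything standard):

```python
# A small sketch of the coefficient formula a_k = f^(k)(0) / k!,
# using sympy for the symbolic derivatives.
import sympy as sp

x = sp.symbols('x')

def taylor_coefficient(f, k):
    """a_k = (order-k derivative of f, at x = 0) divided by k!"""
    return sp.diff(f, x, k).subs(x, 0) / sp.factorial(k)

# For f(x) = sin(x) the first few coefficients are 0, 1, 0, -1/6, ...
coeffs = [taylor_coefficient(sp.sin(x), k) for k in range(4)]
print(coeffs)  # [0, 1, 0, -1/6]
```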

3.1. Quick review of derivatives

Remember that we have three notations for the derivative. When we first learn calculus we usually use the Lagrange notation: \(f'(x)\) for the first derivative, and \(f^{(k)}(x)\) for the order-\(k\) derivative.

Later we also learn the Leibniz notation: \(\frac{df(x)}{dx}\), or \(\frac{df(t)}{dt}\) if it’s with respect to time. One sometimes abbreviates to \(\frac{df}{dx}\) or \(\frac{df}{dt}\).

In physics one often uses the Newton notation for derivatives. If \(x(t)\) is a position as a function of time, then \(\dot{x} = \frac{dx}{dt}\) is the derivative of that position with respect to time. Thus we get \(v = \dot{x}\) and \(a = \dot{v} = \ddot{x}\).

We will liberally jump from one type of notation to another according to the situation.

Recall also the limit definition of the derivative:

\[f'(x) = \lim_{h \rightarrow 0} \frac{f(x + h) - f(x)}{h}\]

So looking at \(f(x) = x^2\) we have:

\[\begin{split}\frac{dx^2}{dx} \;\; = & \lim_{h \rightarrow 0} \frac{(x + h)^2 - x^2}{h} & \\ = & \lim_{h \rightarrow 0} \frac{x^2 + 2xh + h^2 - x^2}{h} & \\ = & \lim_{h \rightarrow 0} \frac{2xh + h^2}{h} & \\ = & \lim_{h \rightarrow 0} \left(2x + h\right) & \\ = & 2 x &\end{split}\]
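The limit can also be checked numerically: for \(f(x) = x^2\) the difference quotient should approach \(2x\) as \(h\) shrinks. A minimal sketch (the helper name `difference_quotient` is mine, not from the text):

```python
# Numeric check of the limit definition for f(x) = x**2:
# (f(x+h) - f(x)) / h approaches 2x as h shrinks.
def difference_quotient(f, x, h):
    return (f(x + h) - f(x)) / h

f = lambda t: t**2
for h in (1e-2, 1e-4, 1e-6):
    print(h, difference_quotient(f, 3.0, h))  # tends to 2*3 = 6
```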

Similar work expanding higher order polynomials gives us:

\[\frac{dx^n}{dx} = n x^{n-1}\]

The other two things to remember about the basic properties of derivatives are:

\[\begin{split}\frac{d \left(A f(x)\right)}{dx} & = A \frac{df(x)}{dx} & \\ \frac{d \left(f(x) + g(x)\right)}{dx} & = \frac{df(x)}{dx} + \frac{dg(x)}{dx} &\end{split}\]

These two properties (of sum and multiplication by a constant) establish that the derivative is a linear operator. You can derive them quite easily by simply writing down the definition of derivative.

Another thing to notice is what happens when we take repeated derivatives of a power of x. Please note the pattern in the following:

\[\begin{split}\frac{d x^3}{dx} & = 3 x^2 & \\ \frac{d^2 x^3}{dx^2} & = 3 \times 2 \times x^1 & \\ \frac{d^3 x^3}{dx^3} & = 3 \times 2 \times 1 \times x^0 & \\ & = 3 \times 2 \times 1 \times 1 & \\ & = 3!\end{split}\]

This generalizes to:

\[\frac{d^n x^n}{dx^n} = n \times (n-1) \times \dots \times 1 = n!\]
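This pattern is easy to verify symbolically; a quick sketch with sympy (a tooling choice of mine, not part of the text):

```python
# Differentiating x**n a total of n times leaves the constant n!.
import sympy as sp

x = sp.symbols('x')
for n in range(1, 6):
    assert sp.diff(x**n, x, n) == sp.factorial(n)
print("d^n x^n / dx^n == n! for n = 1..5")
```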

3.2. Taylor coefficients for polynomials

Here we look at a few polynomials, and calculate their Taylor coefficients.

\[f(x) = 3 x + 7\]

The coefficients \({a_k}\) will be:

\[\begin{split}a_0 & = \frac{f(0)}{0!} & = \frac{f(0)}{1} & = 3 \times 0 + 7 = 7 \\ a_1 & = \frac{f'(0)}{1!} & = \frac{3 \times x^0}{1} & = 3 \\ a_2 & = \frac{f''(0)}{2!} & = \frac{0}{2\times 1} & = 0\end{split}\]

The second and higher derivatives are all zero, so our coefficients are \({7, 3, 0, 0, ...}\) and the Taylor series is:

\[f(x) = 7 x^0 + 3 x^1 + 0 x^2 + \dots = 7 + 3 x\]

which is exactly correct.

You can see where this is going. If \(f(x) = x^4 - 3 x^3 + 2 x^2 - 4\) then the Taylor coefficients will be \({-4, 0, 2, -3, 1, 0, 0, ...}\) and once again we recover the exact same polynomial we started with:

\[f(x) = -4 x^0 + 0 x^1 + 2 x^2 - 3 x^3 + 1 x^4 = -4 + 2x^2 - 3 x^3 + x^4\]
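We can sanity-check this by applying the coefficient formula to the same quartic symbolically (again with sympy; just an illustrative sketch):

```python
# The Taylor coefficients of a polynomial are its own coefficients.
import sympy as sp

x = sp.symbols('x')
f = x**4 - 3*x**3 + 2*x**2 - 4

# a_k = f^(k)(0) / k! for k = 0..6
coeffs = [sp.diff(f, x, k).subs(x, 0) / sp.factorial(k) for k in range(7)]
print(coeffs)  # [-4, 0, 2, -3, 1, 0, 0]
```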

3.3. Taylor coefficients for sin() and cos()

A review of the derivatives of \(\sin(x)\) and \(\cos(x)\). It’s useful, in class, to draw \(\sin(x)\) and \(\cos(x)\) together so that we can see how each is the other’s slope, up to a sign.

\[\begin{split}\frac{d \sin(x)}{dx} = \cos(x) \\ \frac{d \cos(x)}{dx} = - \sin(x)\end{split}\]

The second derivatives are then interesting:

\[\begin{split}\frac{d^2 \sin(x)}{dx^2} = - \sin(x) \\ \frac{d^2 \cos(x)}{dx^2} = - \cos(x)\end{split}\]

so the second derivative is the negative of the original function.

Also note what happens when you have a constant \(\omega\) multiplying \(x\) in there, as in \(\sin(\omega x)\):

\[\begin{split}\frac{d \sin(\omega x)}{dx} = \omega \cos(\omega x) \\ \frac{d \cos(\omega x)}{dx} = - \omega \sin(\omega x)\end{split}\]

and:

(3.3.1)\[\begin{split}\frac{d^2 \sin(\omega x)}{dx^2} = - \omega^2 \sin(\omega x) \\ \frac{d^2 \cos(\omega x)}{dx^2} = - \omega^2 \cos(\omega x)\end{split}\]
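These second-derivative relations can be verified symbolically; a brief sketch with sympy (the tooling choice is mine):

```python
# The second derivative of sin(w*x) is -w**2 * sin(w*x), and
# similarly for cos(w*x).
import sympy as sp

x, w = sp.symbols('x omega')
assert sp.simplify(sp.diff(sp.sin(w * x), x, 2) + w**2 * sp.sin(w * x)) == 0
assert sp.simplify(sp.diff(sp.cos(w * x), x, 2) + w**2 * sp.cos(w * x)) == 0
print("second derivatives check out")
```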

These derivatives are important to remember as they come up when we solve the differential equation for the harmonic oscillator in physics - a ubiquitous phenomenon.

So let us evaluate the various order derivatives of the \(\sin\) function at zero. If \(f(x) = \sin(x)\):

\[\begin{split}& f(0) &=& \;\; & \sin(0) & = \;\;\; & 0 \\ & f'(0) &=& & \cos(0) & = & 1 \\ & f''(0) &=& & -\sin(0) & = & 0 \\ & f^{(3)}(0) &=& & -\cos(0) & = & -1 \\ & f^{(4)}(0) &=& & \sin(0) & = & 0 \\ & f^{(5)}(0) &=& & \cos(0) & = & 1\end{split}\]

and you can see the pattern. The resulting series will look like:

\[\begin{split}\sin(x) = \;\; & \frac{0}{0!} x^0 + \frac{1}{1!} x^1 + \frac{0}{2!} x^2 + \frac{-1}{3!} x^3 + \frac{0}{4!}x^4 + \frac{1}{5!}x^5 + \frac{0}{6!}x^6 + \frac{-1}{7!}x^7 + \dots \\ = \;\; & x - \frac{1}{3!} x^3 + \frac{1}{5!} x^5 - \frac{1}{7!}x^7 + \dots \\\end{split}\]
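Even a few terms of this series approximate \(\sin(x)\) very well near zero. A minimal numeric sketch (the name `sin_series` is my own):

```python
# Partial sum x - x^3/3! + x^5/5! - x^7/7! compared against math.sin.
import math

def sin_series(x, terms=4):
    """Partial sum of the Taylor series for sin about 0."""
    return sum((-1)**n * x**(2 * n + 1) / math.factorial(2 * n + 1)
               for n in range(terms))

print(sin_series(0.5), math.sin(0.5))  # agree to about 8 decimal places
```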

In class we do the same calculation for \(\cos(x)\) and spot the pattern in how the successive derivatives give zeros in the odd terms of the series, and make the + and - alternate. This gives us the even power alternating series for \(\cos(x)\).

The final thing I do here is to talk to the students about odd versus even functions. I first show some examples graphically - typically:

\(x^3 - 3 x\)
\(\sin(x)\)

compared to:

\(x^2 - 4\)
\(\cos(x)\)

With these plots up I discuss how they have different symmetries. Then I write down the definition of odd versus even:

\[\begin{split}\textrm{even function:} \;\; & f(-x) = f(x) \\ \textrm{odd function:} \;\;\; & f(-x) = -f(x)\end{split}\]

I then point out that it should not be surprising that, since \(\sin(x)\) is odd, its Taylor series contains only the odd powers; likewise the series for the even function \(\cos(x)\) contains only the even powers.
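The symmetry is easy to confirm numerically at a sample point; a tiny sketch:

```python
# sin is odd: sin(-x) == -sin(x); cos is even: cos(-x) == cos(x).
import math

x = 1.234  # any sample point works
assert math.isclose(math.sin(-x), -math.sin(x))
assert math.isclose(math.cos(-x), math.cos(x))
print("sin is odd, cos is even at x =", x)
```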