Taylor Series

Reading time: ~20 min

We can define a polynomial which approximates a smooth function in the vicinity of a point with the following idea: match as many derivatives as possible.

The utility of this simple idea emerges from the convenient simplicity of polynomials and the fact that a wide class of functions look pretty much like polynomials when you zoom in around a given point.

First, a bit of review on the exponential function x\mapsto \exp(x): we define \exp to be the function which maps 0 to 1 and which is everywhere equal to its own derivative. It follows (nontrivially) from this definition that \exp(x) = \exp(1)^x, so we may define \mathrm{e} = \exp(1) and write the exponential function as x\mapsto \mathrm{e}^x. The value of \mathrm{e} is approximately 2.718.
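As an aside (not from the original text), this defining property can be checked numerically: stepping the equation y' = y forward from y(0) = 1 with Euler's method recovers the value \mathrm{e} \approx 2.718. The helper name euler_exp below is purely illustrative.

```python
# Sketch (illustrative, not from the text): approximate exp(1) using only the
# defining property exp(0) = 1 and exp' = exp, via Euler's method for y' = y.
def euler_exp(x, steps=1_000_000):
    """Approximate exp(x) by stepping the ODE y' = y from y(0) = 1."""
    h = x / steps
    y = 1.0
    for _ in range(steps):
        y += h * y          # each step uses y' = y
    return y

print(euler_exp(1.0))       # roughly 2.71828, matching e
```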

Example
Find the quadratic polynomial P_2 whose zeroth, first, and second derivatives at the origin match those of the exponential function.

Solution. Since P_2 is quadratic, we must have

\begin{align*}P_2(x) = a_0 + a_1x + a_2x^2\end{align*}

for some a_0, a_1, and a_2. To match the zeroth derivatives (the function values), we check that P_2(0) = a_0 and f(0) = 1, so we must have a_0 = 1. Similarly, P_2'(0) = a_1, so if we want P_2'(0) = f'(0) = 1, we have to choose a_1 = 1 as well.

For a_2, we calculate P_2''(x) = (a_1 + 2a_2x)' = 2a_2, so to get P_2''(0) = f''(0) = 1, we have to let a_2 = \tfrac{1}{2}. So

\begin{align*}P_2(x) = 1 + x + \tfrac{1}{2}x^2\end{align*}

is the best we can do. Looking at the figure, we see that P_2 does indeed do a better job of 'hugging' the graph of f near x=0 than the best linear approximation (L(x) = 1 + x) does.

The best constant, linear, and quadratic approximations of \mathrm{e}^x near the origin
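For a quick computational check (an illustrative sketch assuming SymPy is available), the quadratic Taylor polynomial of the exponential function at the origin can be computed directly and matches the result above.

```python
# Check (illustrative sketch) that the quadratic Taylor polynomial of exp at 0
# is 1 + x + x**2/2, as derived above.
import sympy as sp

x = sp.symbols('x')
P2 = sp.series(sp.exp(x), x, 0, 3).removeO()   # terms up to x**2
print(P2)                                       # x**2/2 + x + 1
```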

We can extend this idea to higher order polynomials, and we can even include terms for all powers of x, thereby obtaining an infinite series:

Definition (Taylor Series)
The Taylor series, centered at c, of an infinitely differentiable function f is defined to be

\begin{align*}f(c) + f'(c)(x-c) + \frac{f''(c)}{2!}(x-c)^2 + \frac{f'''(c)}{3!}(x-c)^3 + \cdots\end{align*}
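This definition translates directly into a computation: each coefficient is a derivative at c divided by a factorial. The sketch below (assuming SymPy; taylor_coefficients is an illustrative helper, not a library function) computes the first few coefficients from the definition.

```python
# Sketch: compute Taylor coefficients f^(n)(c)/n! directly from the definition.
import sympy as sp

def taylor_coefficients(f, x, c, n_terms):
    """Return [f(c), f'(c), f''(c)/2!, ...] for the Taylor series centered at c."""
    return [sp.diff(f, x, n).subs(x, c) / sp.factorial(n) for n in range(n_terms)]

x = sp.symbols('x')
print(taylor_coefficients(sp.sin(x), x, 0, 6))   # [0, 1, 0, -1/6, 0, 1/120]
```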

Example
Find the Taylor series centered at the origin for the exponential function.

Solution. We continue the pattern we discovered for the quadratic approximation of the exponential function at the origin: the nth derivative of a_0 + a_1x + \cdots + a_n x^n + \cdots at the origin is n!a_n, while the nth derivative of the exponential function at the origin is 1. Therefore, a_n = 1/n!, and we obtain the Taylor series

\begin{align*}1 + x + \frac{x^2}{2!} + \frac{x^3}{3!} + \cdots\end{align*}

It turns out that this series does in fact converge to \mathrm{e}^x, for all x \in \mathbb{R}.
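To see this convergence concretely (an illustrative numeric check, not part of the original text), the partial sums of the series approach \mathrm{e}^x quickly; here at x = 2.

```python
# Partial sums of 1 + x + x**2/2! + ... approach e**x; illustrated at x = 2.
import math

x = 2.0
partial = 0.0
for n in range(15):
    partial += x**n / math.factorial(n)
print(partial, math.exp(x))   # both approximately 7.389056
```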

Taylor series properties

It turns out that if the Taylor series for a function converges, then it does so in an interval centered around c. Furthermore, inside the interval of convergence, it is valid to perform term-by-term operations with the Taylor series as though it were a polynomial:

  • We can multiply or add Taylor series term-by-term.
  • We can integrate or differentiate a Taylor series term-by-term.
  • We can substitute one Taylor series into another to obtain a Taylor series for the composition.

Theorem
All the operations described above may be applied wherever all the series in question are convergent. In other words, if f and g have Taylor series P and Q converging to f and g in some open interval, then the Taylor series for fg, f+g, f', and \int f converge in that interval and are given by PQ, P+Q, P', and \int P, respectively. If P has an infinite radius of convergence, then the Taylor series for f\circ g is given by P\circ Q.

The following example shows how convenient this theorem can be for finding Taylor series.

Example
Find the Taylor series for f(x) = \cos x + x \mathrm{e}^{x^2} centered at c = 0.

Solution. Taking many derivatives is going to be no fun, especially with that second term. What we can do, however, is just substitute x^2 into the Taylor series for the exponential function, multiply that by x, and add the Taylor series for cosine:

\begin{align*}&\left(1 - \frac{x^2}{2!} + \frac{x^4}{4!} - \cdots\right) + x\left(1 + x^2 + \frac{(x^2)^2}{2!} + \frac{(x^2)^3}{3!} + \cdots\right) \\ &= 1 + x - \frac{x^2}{2!} + x^3 + \frac{x^4}{4!} + \frac{x^5}{2!} + \cdots.\end{align*}

In summation notation, we could write this series as \sum_{n=0}^\infty a_n x^n where a_n is equal to (-1)^{n/2}/n! if n is even and 1/((n-1)/2)! if n is odd.
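As a consistency check (an assumed sketch using SymPy, not part of the original solution), a computer algebra system produces the same series by direct expansion.

```python
# Direct expansion of cos(x) + x*exp(x**2) about 0, matching the series above.
import sympy as sp

x = sp.symbols('x')
f = sp.cos(x) + x * sp.exp(x**2)
print(sp.series(f, x, 0, 6))
# 1 + x - x**2/2 + x**3 + x**4/24 + x**5/2 + O(x**6)
```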

Exercise
Find the Taylor series for 1/(1-x) centered at the origin, and show that it converges to 1/(1-x) for all -1 < x < 1.

Use your result to find x + 2x^2 + 3x^3 + 4x^4 + \cdots. Hint: think about differentiation.

Solution. The nth derivative of 1/(1-x) is n!/(1-x)^{n+1}, which equals n! at the origin, so the Taylor series centered at the origin is 1 + x + x^2 + \cdots. Furthermore, we know that

\begin{align*}\frac{1}{1-x} = 1 + x + x^2 + x^3 + \cdots,\end{align*}

for -1 < x < 1, by the formula for infinite geometric series.

We can use this result to find \sum_{k = 1}^\infty k x^k. Differentiating both sides gives

\begin{align*}\frac{1}{(1-x)^2} = 1 + 2x + 3x^2 + 4x^3 + \cdots\end{align*}

and multiplying both sides by x, we get

\begin{align*}\frac{x}{(1-x)^2} = x + 2x^2 + 3x^3 + 4x^4 + \cdots\end{align*}
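The same manipulation can be checked symbolically (an illustrative sketch assuming SymPy): differentiate 1/(1-x), multiply by x, and expand.

```python
# Differentiate 1/(1-x), multiply by x, and expand: x + 2x**2 + 3x**3 + ...
import sympy as sp

x = sp.symbols('x')
g = sp.diff(1 / (1 - x), x)           # equals 1/(1-x)**2
print(sp.series(x * g, x, 0, 6))      # x + 2*x**2 + 3*x**3 + 4*x**4 + 5*x**5 + O(x**6)
```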

Exercise
Show that \lim_{n\to\infty}(1+x/n)^n is equal to \mathrm{e}^x by showing that \lim_{n\to\infty}\log (1+x/n)^n = x.

Solution. Integrating the equation

\begin{align*}\frac{1}{1+x} = 1 - x + x^2 - x^3 + x^4 - \cdots\end{align*}

term by term, we find that

\begin{align*}\log(1+x) = x - \frac{x^2}{2} + \frac{x^3}{3} - \frac{x^4}{4} + \cdots\end{align*}

Substituting x/n for x and multiplying both sides by n gives

\begin{align*}n \log (1+x/n) = x - \frac{x^2}{2n} + \frac{x^3}{3n^2} - \cdots.\end{align*}

Each of the terms other than the first converges to 0, and we can take limits term-by-term since x/n is inside the interval of convergence for this series. Therefore, \lim_{n\to\infty}\log(1+x/n)^n = x, and since the exponential function is continuous, this implies that \lim_{n\to\infty}(1+x/n)^n = \mathrm{e}^x.
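The limit can also be seen numerically (an illustrative check, not part of the original solution): for a fixed x, the values (1 + x/n)^n settle down to \mathrm{e}^x as n grows.

```python
# (1 + x/n)**n approaches e**x as n grows; illustrated at x = 1.5.
import math

x = 1.5
for n in (10, 1_000, 100_000):
    print(n, (1 + x / n) ** n)
print("exp(x) =", math.exp(x))   # approximately 4.481689
```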
