# Taylor's Theorem

For an $N$ times differentiable function $f(x)$, with $x$ and $a$ in an open interval on the real line, there exists $c$ between $x$ and $a$ such that

$f(x)=\sum _{n=0}^{N-1}{\frac {f^{(n)}(a)}{n!}}(x-a)^{n}+R_{N},$ where the remainder is given by $R_{N}={\frac {f^{(N)}(c)}{N!}}(x-a)^{N}.$

## Applications of Taylor's theorem

In practice, Taylor's theorem provides a convenient alternate representation of a function: an expansion about a given point. The number of usable terms in the expansion reflects the number of continuous derivatives the function has at the point about which it is expanded.
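As a concrete check of the theorem, the sketch below (a hypothetical worked example: $f(x)=\sin x$, $a=0$, $N=5$) builds the truncated expansion numerically and recovers a point $c$ between $a$ and $x$ satisfying the Lagrange remainder formula.

```python
import math

# Numerical check of Taylor's theorem for f(x) = sin x, a = 0, N = 5.
# The derivatives of sin at 0 cycle through 0, 1, 0, -1, so the
# truncated expansion is easy to assemble.
x, a, N = 1.0, 0.0, 5
deriv_at_zero = [0.0, 1.0, 0.0, -1.0]

partial = sum(deriv_at_zero[n % 4] / math.factorial(n) * (x - a) ** n
              for n in range(N))
R_N = math.sin(x) - partial        # the actual remainder of the truncation

# The theorem asserts R_N = f^(N)(c)/N! * (x - a)^N for some c in (a, x);
# here f^(5) = cos, so we can solve for c directly.
c = math.acos(math.factorial(N) * R_N / (x - a) ** N)
```

Running this gives a value of $c$ strictly between $0$ and $1$, as the theorem guarantees.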

Taylor's theorem forms the foundation of a number of numerical computation schemes, including the approximation of smooth functions, the construction of finite-difference methods, and the design of optimization algorithms.
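For instance, subtracting the Taylor expansions of $f(x+h)$ and $f(x-h)$ about $x$ cancels the even-order terms and yields a second-order finite-difference formula for $f^{\prime }(x)$. A minimal sketch (the test function and step sizes are illustrative):

```python
import math

# Central difference for f'(x), obtained by subtracting the Taylor
# expansions of f(x + h) and f(x - h): the leading error term is
# (h^2 / 6) f'''(x), so halving h shrinks the error about fourfold.
def central_diff(f, x, h):
    return (f(x + h) - f(x - h)) / (2.0 * h)

exact = math.cos(1.0)                            # d/dx sin x at x = 1
err_h  = abs(central_diff(math.sin, 1.0, 1e-2) - exact)
err_h2 = abs(central_diff(math.sin, 1.0, 5e-3) - exact)
```

The error ratio `err_h / err_h2` comes out close to 4, confirming the $O(h^{2})$ behavior predicted by the expansion.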

The complex form of the Taylor series for a complex-valued function of a complex variable converges if and only if the function is analytic in a neighborhood of the point about which it is being expanded.

# Taylor series as an infinite series in 1D

A real-valued function $f(x)$ can be expressed in terms of the value of the function and its derivatives at any point $x=b$ . In one variable this is

$f(x)=f(b)+{\frac {1}{1!}}{\frac {df(b)}{dx}}(x-b)+{\frac {1}{2!}}{\frac {d^{2}f(b)}{dx^{2}}}(x-b)^{2}+{\frac {1}{3!}}{\frac {d^{3}f(b)}{dx^{3}}}(x-b)^{3}+\cdots$

where $!$ denotes the factorial (e.g., $3!=3\times 2\times 1=6$ ).
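As an illustration of summing this expansion directly (a sketch; the choice $f(x)=e^{x}$, $b=1$ is arbitrary), every derivative of $e^{x}$ at $b$ equals $e^{b}$, so each term is simple to form:

```python
import math

# Taylor expansion of exp(x) about b = 1: every derivative at b is e^b,
# so the k-th term is e^b * (x - b)^k / k!.
def taylor_exp(x, b, terms):
    return sum(math.exp(b) * (x - b) ** k / math.factorial(k)
               for k in range(terms))

approx = taylor_exp(2.0, 1.0, 20)   # expand about b = 1, evaluate at x = 2
```

With 20 terms the partial sum already agrees with $e^{2}$ to near machine precision.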

By the ratio test, the Taylor series converges for values of $x$ where the absolute ratio of the $(n+1)$ -th to the $n$ -th term is less than 1 in the limit. That is:

$\left|{\frac {{\frac {f^{(n+1)}(b)}{(n+1)!}}(x-b)^{n+1}}{{\frac {f^{(n)}(b)}{n!}}(x-b)^{n}}}\right|=\left|{\frac {f^{(n+1)}(b)}{f^{(n)}(b)\,(n+1)}}(x-b)\right|<1$

The Maclaurin series is the special case where $b=0$.
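For the Maclaurin series of $e^{x}$, for example, the term ratio reduces to $|x|/(n+1)$, which drops below 1 once $n+1>|x|$, so the series converges for every $x$. A quick numerical sketch:

```python
import math

# Term ratio for the Maclaurin series of exp(x): term_{n+1}/term_n = x/(n+1).
# For x = 3 the ratio exceeds 1 for small n but eventually falls below 1,
# and the partial sums still converge to exp(3).
x = 3.0
ratios = [x / (n + 1) for n in range(10)]

partial = sum(x ** n / math.factorial(n) for n in range(30))
```

The early ratios exceed 1 (the first few terms grow), yet the tail ratios shrink and the partial sum matches `math.exp(3.0)` closely.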

## Proof of the 1D Taylor Theorem

### N=1 Case

For $N=1$ we write

$f(x)=f(a)+R_{1}(x).$

We define a new function $r_{1}(w)=f(x)-f(w)$, so that $r_{1}(a)=R_{1}(x)$.

Differentiating with respect to $w$ we obtain

$r_{1}^{\prime }(w)=-f^{\prime }(w).$

Now we construct the function $S_{1}(w)$ such that $S_{1}(a)=0$ and $S_{1}(x)=0$ :

$S_{1}(w)=r_{1}(w)-\left({\frac {x-w}{x-a}}\right)r_{1}(a).$

Because $S_{1}(a)=S_{1}(x)=0$, by Rolle's theorem $S_{1}^{\prime }(c)=0$ at some value $c$ between $a$ and $x$ , and

$r_{1}^{\prime }(c)+{\frac {r_{1}(a)}{(x-a)}}=0.$

Solving for the remainder gives $r_{1}(a)=R_{1}(x)=f^{\prime }(c)(x-a)$ , yielding

$f(x)=f(a)+f^{\prime }(c)(x-a).$

### Arbitrary N Case

This same method generalizes to the case of arbitrary $N$ ,

$f(x)=\sum _{n=0}^{N-1}{\frac {f^{(n)}(a)}{n!}}(x-a)^{n}+R_{N}(x).$

We define a new function $r_{N}(w)$ such that $r_{N}(a)=R_{N}(x)$ :

$r_{N}(w)=f(x)-\sum _{n=0}^{N-1}{\frac {f^{(n)}(w)}{n!}}(x-w)^{n}.$

As above, we differentiate $r_{N}(w)$ with respect to $w$ ; the sum telescopes term by term, yielding

$r_{N}^{\prime }(w)=-{\frac {f^{(N)}(w)}{(N-1)!}}(x-w)^{N-1}.$

As in the $N=1$ case, we construct the function $S_{N}(w)$ such that $S_{N}(a)=0$ and $S_{N}(x)=0$ :

$S_{N}(w)=r_{N}(w)-\left({\frac {x-w}{x-a}}\right)^{N}r_{N}(a).$

Because $S_{N}(a)=S_{N}(x)=0$, by Rolle's theorem there is a point $c$ somewhere between $x$ and $a$ at which

$S_{N}^{\prime }(c)=0=r_{N}^{\prime }(c)+N{\frac {(x-c)^{N-1}}{(x-a)^{N}}}r_{N}(a).$

Solving for $r_{N}(a)=R_{N}(x)$ completes the proof:

$f(x)=\sum _{n=0}^{N-1}{\frac {f^{(n)}(a)}{n!}}(x-a)^{n}+{\frac {f^{(N)}(c)}{N!}}(x-a)^{N}.$

## Proof of the infinite series form of Taylor's theorem in 1D

By the fundamental theorem of calculus we note that

$\int _{b}^{x}f^{\prime }(y)\;dy=f(x)-f(b),$

implying that

$f(x)=f(b)+\int _{b}^{x}f^{\prime }(y)\;dy.$

We construct the Taylor series by repeated integration by parts of the remainder term. Introducing a factor $(x-y)^{0}$ ,

$f(x)=f(b)+\int _{b}^{x}f^{\prime }(y)(x-y)^{0}\;dy.$

We integrate the remainder term by parts, integrating the $(x-y)^{0}$ factor and differentiating $f^{\prime }(y)$ , to obtain

$\int _{b}^{x}f^{\prime }(y)(x-y)^{0}\;dy=f^{\prime }(b)(x-b)+\int _{b}^{x}f^{\prime \prime }(y)(x-y)\;dy.$

Repeating this process yields the familiar form of the Taylor expansion. After $N$ applications, the remainder term satisfies

$\left|{\frac {1}{N!}}\int _{b}^{x}f^{(N+1)}(y)(x-y)^{N}\;dy\right|\leq {\frac {1}{N!}}\max |f^{(N+1)}(y)|\left|\int _{b}^{x}(x-y)^{N}\;dy\right|\leq {\mbox{const.}}\left|{\frac {(x-b)^{N+1}}{(N+1)!}}\right|\rightarrow 0\qquad {\mbox{as}}\qquad N\rightarrow \infty .$

# Taylor's series of an analytic function

Given a function $f(z)$ analytic inside some region ${\mathcal {R}}$ of the complex $z$ -plane, we may write

$f(z)=\sum _{n=0}^{\infty }{\frac {f^{(n)}(a)}{n!}}(z-a)^{n},$

where the series converges in a disc, centered at $z=a$ , contained within ${\mathcal {R}}$ .
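A concrete example (a sketch using Python's built-in complex type; the sample point is arbitrary): $f(z)=1/(1-z)$ is analytic everywhere except $z=1$, and its Maclaurin series $\sum _{n}z^{n}$ converges exactly on the disc $|z|<1$.

```python
# Maclaurin series of f(z) = 1/(1 - z), analytic except at z = 1:
# f^(n)(0)/n! = 1 for all n, so the series is the geometric sum of z^n,
# which converges precisely for |z| < 1.
def geometric_partial(z, terms):
    return sum(z ** n for n in range(terms))

inside = 0.3 + 0.4j                     # |z| = 0.5, inside the disc
approx = geometric_partial(inside, 60)
exact = 1.0 / (1.0 - inside)
```

Inside the disc the partial sums converge rapidly to $1/(1-z)$; for $|z|>1$ the same partial sums grow without bound, mirroring the singularity at $z=1$.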