Polynomial Approximation with Derivatives

In [7]:
import numpy as np
import matplotlib.pyplot as plt

from math import factorial

A Brief Intro to sympy

Here we import sympy, a package for symbolic computation with Python.

In [8]:
import sympy as sp
sp.init_printing()

Next, we make a (symbolic) variable $x$ from which we can then build more complicated expressions:

In [9]:
x = sp.Symbol("x")
x
Out[9]:
$$x$$

Build up an expression with $x$ and assign it to expr. Observe that the expression isn't evaluated numerically; the result of this computation is a Python object that represents the computation:

In [10]:
expr = sp.sin(sp.sqrt(x)+2)**2
expr
Out[10]:
$$\sin^{2}{\left (\sqrt{x} + 2 \right )}$$

Next, take a derivative, using .diff(x).

In [11]:
expr.diff(x)
Out[11]:
$$\frac{1}{\sqrt{x}} \sin{\left (\sqrt{x} + 2 \right )} \cos{\left (\sqrt{x} + 2 \right )}$$

Take 10 derivatives:

In [12]:
expr.diff(x, 10)
Out[12]:
$$\frac{1}{512} \left(- \frac{256}{x^{5}} \sin^{2}{\left (\sqrt{x} + 2 \right )} + \frac{256}{x^{5}} \cos^{2}{\left (\sqrt{x} + 2 \right )} + \frac{63360}{x^{6}} \sin^{2}{\left (\sqrt{x} + 2 \right )} - \frac{63360}{x^{6}} \cos^{2}{\left (\sqrt{x} + 2 \right )} - \frac{2162160}{x^{7}} \sin^{2}{\left (\sqrt{x} + 2 \right )} + \frac{2162160}{x^{7}} \cos^{2}{\left (\sqrt{x} + 2 \right )} + \frac{18918900}{x^{8}} \sin^{2}{\left (\sqrt{x} + 2 \right )} - \frac{18918900}{x^{8}} \cos^{2}{\left (\sqrt{x} + 2 \right )} - \frac{34459425}{x^{9}} \sin^{2}{\left (\sqrt{x} + 2 \right )} + \frac{34459425}{x^{9}} \cos^{2}{\left (\sqrt{x} + 2 \right )} - \frac{11520}{x^{\frac{11}{2}}} \sin{\left (\sqrt{x} + 2 \right )} \cos{\left (\sqrt{x} + 2 \right )} + \frac{887040}{x^{\frac{13}{2}}} \sin{\left (\sqrt{x} + 2 \right )} \cos{\left (\sqrt{x} + 2 \right )} - \frac{15135120}{x^{\frac{15}{2}}} \sin{\left (\sqrt{x} + 2 \right )} \cos{\left (\sqrt{x} + 2 \right )} + \frac{64864800}{x^{\frac{17}{2}}} \sin{\left (\sqrt{x} + 2 \right )} \cos{\left (\sqrt{x} + 2 \right )} - \frac{34459425}{x^{\frac{19}{2}}} \sin{\left (\sqrt{x} + 2 \right )} \cos{\left (\sqrt{x} + 2 \right )}\right)$$

Use .subs(x, ...) and .evalf() to evaluate your expression for $x=1$.

In [13]:
expr.subs(x, 1).evalf()
Out[13]:
$$0.019914856674817$$
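As a quick cross-check (not part of the original notebook), the same value can be computed with plain floating-point numpy, since at $x = 1$ the expression is just $\sin^2(3)$:

In [ ]:
# At x = 1, expr is sin(sqrt(1) + 2)**2 = sin(3)**2; this should agree with
# the .evalf() result above.
np.sin(np.sqrt(1.0) + 2)**2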

Polynomial Approximation

Here are some functions to play with:

In [17]:
#f = sp.exp(x)
f = sp.sqrt(1-x**2)
#f = 1/(20*x-10)

Plot f:

In [18]:
def plot_sympy(my_f, my_pts, **kwargs):
    # Evaluate the sympy expression my_f at each point and plot the values.
    f_values = np.array([my_f.subs(x, pt) for pt in my_pts])
    plt.plot(my_pts, f_values, **kwargs)
    plt.ylim([-1.3, 1.3])
In [19]:
pts = np.linspace(-1, 1, 1000)
plot_sympy(f, pts)

The Taylor polynomial uses derivatives to automatically get close to the function near the expansion center: the higher the degree, the closer the approximation.
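Concretely, the degree-$n$ Taylor polynomial of $f$ about $0$ is

$$T_n(x) = \sum_{i=0}^{n} \frac{f^{(i)}(0)}{i!}\, x^{i}.$$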

Write out the degree 4 Taylor polynomial about 0, call it taylor4:

In [21]:
taylor4 = (
    f.subs(x, 0)
    + f.diff(x).subs(x, 0) * x
    + f.diff(x, 2).subs(x, 0)/2 * x**2
    + f.diff(x, 3).subs(x, 0)/6 * x**3
    + f.diff(x, 4).subs(x, 0)/24 * x**4
)
taylor4
Out[21]:
$$- \frac{x^{4}}{8} - \frac{x^{2}}{2} + 1$$
In [22]:
plot_sympy(taylor4, pts, label="taylor4")
plot_sympy(f, pts, label="f")
plt.legend(loc="best")
Out[22]:
<matplotlib.legend.Legend at 0x7f88ff238e10>
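As a quick numerical sanity check (not part of the original notebook), compare the two at a point not far from the expansion center, say $x = 1/2$:

In [ ]:
# Near the expansion center the polynomial should be close to f:
# taylor4(1/2) = 1 - 1/8 - 1/128 = 111/128 ≈ 0.8672, while f(1/2) = sqrt(3)/2 ≈ 0.8660.
(taylor4.subs(x, sp.Rational(1, 2)), f.subs(x, sp.Rational(1, 2)))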

Now write code to do this for any order n. Put the result in taylor:

In [23]:
n = 20
In [26]:
taylor = 0
# range(n) stops at n-1, so this builds the Taylor polynomial of degree n-1.
for i in range(n):
    taylor += f.diff(x, i).subs(x, 0)/factorial(i) * x**i
In [25]:
taylor
Out[25]:
$$- \frac{715 x^{18}}{65536} - \frac{429 x^{16}}{32768} - \frac{33 x^{14}}{2048} - \frac{21 x^{12}}{1024} - \frac{7 x^{10}}{256} - \frac{5 x^{8}}{128} - \frac{x^{6}}{16} - \frac{x^{4}}{8} - \frac{x^{2}}{2} + 1$$
In [158]:
plot_sympy(taylor, pts, label="taylor")
plot_sympy(f, pts, label="f")
plt.legend(loc="best")
Out[158]:
<matplotlib.legend.Legend at 0x7f7cf1d64198>
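The same construction works about any expansion center, not just $0$. As a small sketch (the helper name taylor_poly is not part of the original notebook; it reuses the symbol x and the factorial import from above), one way to wrap it up:

In [ ]:
def taylor_poly(f_expr, x0, degree):
    # Taylor polynomial of the given degree for the sympy expression f_expr,
    # expanded about the point x0.
    return sum(
        f_expr.diff(x, i).subs(x, x0) / factorial(i) * (x - x0)**i
        for i in range(degree + 1)
    )

# For example, taylor_poly(f, 0, 4) reproduces taylor4 above, and
# taylor_poly(f, 0, n-1) matches taylor (the loop used range(n)).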

Behavior of the Error (Part II)

Let's repeat the setup, for clarity:

In [166]:
f = 1/(20*x-10)
In [167]:
n = 4

taylor = 0
for i in range(n):
    taylor += f.diff(x, i).subs(x, 0)/factorial(i) * x**i
    
error = taylor - f
In [174]:
plot_sympy(error, pts, label="error")
plt.legend(loc="best")
Out[174]:
<matplotlib.legend.Legend at 0x7f7cf1ba3f98>
  • What's the error at the expansion center? (See the quick check below.)
  • How does the error depend on the distance to the expansion center?
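As a quick check on the first question (not part of the original notebook), evaluate the error at the center itself:

In [ ]:
# The Taylor polynomial matches f exactly at the expansion center by
# construction, so the error there should be zero.
error.subs(x, 0)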

Let's look back at this Taylor expansion:

In [169]:
taylor
Out[169]:
$$- \frac{4 x^{3}}{5} - \frac{2 x^{2}}{5} - \frac{x}{5} - \frac{1}{10}$$
  • To get an idea of what the error might look like, compute the Taylor polynomial one order higher and store it in next_taylor.
  • Compare that to taylor.
In [190]:
next_taylor = 0
for i in range(n+1):
    next_taylor += f.diff(x, i).subs(x, 0)/factorial(i) * x**i
    
taylor-next_taylor
Out[190]:
$$\frac{8 x^{4}}{5}$$

How does that difference compare to the actual error?

In [175]:
plot_sympy(taylor-next_taylor, pts, label="diff to next")
plot_sympy(error, pts, label="error")
plt.legend(loc="best")
Out[175]:
<matplotlib.legend.Legend at 0x7f7cf00b54e0>
  • Far away from the expansion center, it doesn't look like they have much to do with each other.
  • To get a better idea of what happens close to the center, use a log-log plot:
In [189]:
pos_pts = 10**np.linspace(-3, 0.5)
err_values = [abs(error.subs(x, pt)) for pt in pos_pts]
plt.loglog(pos_pts, err_values)
plt.grid()
plt.xlabel("$x$")
plt.ylabel("Error")
Out[189]:
<matplotlib.text.Text at 0x7f7ceb075e80>
  • What is the behavior of the error as we get very close to the expansion center? (A rough slope estimate follows below.)
  • Is there any predictive power in what we just observed?
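On a log-log plot a power law $C\,x^{p}$ shows up as a straight line of slope $p$, so the slope near the center tells us the power. Here is a rough sketch (not part of the original notebook; the cutoff 0.1 is an arbitrary choice) that estimates it from the values computed above:

In [ ]:
# Fit a straight line to log10(error) vs. log10(x), using only points close
# to the expansion center, where the leading error term dominates.
err_arr = np.array(err_values, dtype=float)
close = pos_pts < 0.1
np.polyfit(np.log10(pos_pts[close]), np.log10(err_arr[close]), 1)[0]

The slope should come out close to $4$, matching the $\frac{8 x^{4}}{5}$ difference found above.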

Predicting Taylor Error (Part III)

In [46]:
f = sp.sqrt(x-10)
# Degree-3 Taylor polynomial of f about the expansion center 12:
t = sum([f.diff(x, i).subs(x, 12)/factorial(i) * (x-12)**i for i in range(4)])
t
Out[46]:
$$\frac{\sqrt{2}}{128} \left(x - 12\right)^{3} - \frac{\sqrt{2}}{32} \left(x - 12\right)^{2} + \frac{\sqrt{2}}{4} \left(x - 12\right) + \sqrt{2}$$
In [47]:
f.subs(x, 12.5)
Out[47]:
$$1.58113883008419$$
In [48]:
t.subs(x, 12.5).evalf()
Out[48]:
$$1.5813227821457$$
In [59]:
error1 = f.subs(x, 12.5) - t.subs(x, 12.5).evalf()
abs(error1)
Out[59]:
$$0.000183952061507453$$

Since the error of a degree-3 Taylor polynomial scales like the fourth power of the distance to the expansion center, halving the distance (from $0.5$ to $0.25$) should shrink the error by a factor of $2^4 = 16$. Now use that to predict the error at $12.25$:

In [60]:
abs(error1) * (0.25/0.5)**4
Out[60]:
$$1.14970038442158 \cdot 10^{-5}$$
In [61]:
error2 = f.subs(x, 12.25) - t.subs(x, 12.25).evalf()
abs(error2)
Out[61]:
$$1.24076489040892 \cdot 10^{-5}$$
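The prediction holds up reasonably well: the ratio of the two measured errors is $1.24 \cdot 10^{-5} / 1.84 \cdot 10^{-4} \approx 0.067$, close to the predicted factor of $(0.25/0.5)^4 = 0.0625$.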
In [ ]: