Using Richardson Extrapolation with Finite Differences

In [1]:
from math import sin, cos

Here are a function and its derivative. We also choose a "center" about which we carry out our experiments:

In [2]:
f = sin
df = cos

x = 2.3

We then compare the accuracy of:

  • First-order (right) differences with step size 2h
  • First-order (right) differences with step size h, i.e. half the step size
  • An estimate based on these two using Richardson extrapolation (derived below)

against true, the actual value of the derivative.

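Briefly, why the extrapolation weights are -1 and 2: a first-order difference with step size h has an error whose leading term is linear in h, so writing fd(h) for the difference quotient,

$$ \mathrm{fd}(h) = f'(x) + c\,h + O(h^2) \qquad\Rightarrow\qquad 2\,\mathrm{fd}(h) - \mathrm{fd}(2h) = f'(x) + O(h^2). $$

In the notation of the cell below, fd1 = fd(2h) and fd2 = fd(h), so the combination richardson = 2*fd2 - fd1 cancels the leading error term and should converge at second order.
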
In [9]:
for k in range(3, 10):
    h = 2**(-k)

    # First-order right differences with step sizes 2h and h.
    fd1 = (f(x+2*h) - f(x))/(2*h)
    fd2 = (f(x+h) - f(x))/h

    # Richardson extrapolation: 2*fd2 - fd1 cancels the O(h) error term.
    richardson = (-1)*fd1 + 2*fd2

    true = df(x)

    print("Err FD1: %g\tErr FD2: %g\tErr Rich: %g" % (
            abs(true-fd1),
            abs(true-fd2),
            abs(true-richardson)))
Err FD1: 0.08581	Err FD2: 0.0448122	Err Rich: 0.00381441
Err FD1: 0.0448122	Err FD2: 0.022862	Err Rich: 0.000911846
Err FD1: 0.022862	Err FD2: 0.0115423	Err Rich: 0.000222501
Err FD1: 0.0115423	Err FD2: 0.00579859	Err Rich: 5.49282e-05
Err FD1: 0.00579859	Err FD2: 0.00290612	Err Rich: 1.3644e-05
Err FD1: 0.00290612	Err FD2: 0.00145476	Err Rich: 3.39995e-06
Err FD1: 0.00145476	Err FD2: 0.000727804	Err Rich: 8.48602e-07
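
Note how the FD2 error roughly halves from one row to the next, while the Richardson error shrinks by roughly a factor of four: first- versus second-order convergence, as the derivation above predicts. Here is a minimal sketch that checks this empirically by taking log2 of the ratios of successive errors (it reuses f, df, and x from the cells above; the variable names are illustrative):

In [ ]:
from math import log2

# Recompute the errors for the step-h difference and the extrapolant.
errs_fd2, errs_rich = [], []
for k in range(3, 10):
    h = 2**(-k)
    fd1 = (f(x+2*h) - f(x))/(2*h)
    fd2 = (f(x+h) - f(x))/h
    errs_fd2.append(abs(df(x) - fd2))
    errs_rich.append(abs(df(x) - ((-1)*fd1 + 2*fd2)))

# err(h) ~ C*h**p, so halving h gives p ~ log2(err(2h)/err(h)).
for i in range(1, len(errs_fd2)):
    print("Order FD2: %.2f\tOrder Rich: %.2f" % (
            log2(errs_fd2[i-1]/errs_fd2[i]),
            log2(errs_rich[i-1]/errs_rich[i])))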