Using Richardson Extrapolation with Finite Differences

In [1]:
from math import sin, cos

Here are a function and its derivative. We also choose a "center" about which we carry out our experiments:

In [2]:
f = sin
df = cos

x = 2.3

We then compare the accuracy of:

  • First-order (right) differences
  • First-order (right) differences with half the step size
  • An estimate based on these two using Richardson extrapolation

against true, the exact derivative value df(x).
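
Why do the weights in the loop below take the form they do? A short sketch of the standard argument (the error expansion is assumed, not derived here): a first-order forward difference with step size $h$ satisfies

$$D_h = f'(x) + C\,h^p + O(h^{p+1}), \qquad p = 1,$$

so a combination $\alpha D_{h_1} + \beta D_{h_2}$ cancels the leading error term exactly when

$$\alpha + \beta = 1, \qquad \alpha h_1^p + \beta h_2^p = 0, \qquad\text{i.e.}\qquad \alpha = -\frac{h_2^p}{h_1^p - h_2^p}, \quad \beta = 1 - \alpha.$$

With $h_1 = 2h_2$ and $p = 1$ this gives $\alpha = -1$ and $\beta = 2$, so the extrapolated value is $2D_{h_2} - D_{h_1}$, accurate to $O(h^2)$.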

In [3]:
for k in range(3, 10):
    h = 2**(-k)

    # first-order (right) difference with step size 2h
    h1 = 2*h
    fd1 = (f(x+h1) - f(x))/h1

    # first-order (right) difference with step size h
    h2 = h
    fd2 = (f(x+h2) - f(x))/h2

    # Richardson extrapolation: weights chosen to cancel the O(h**p) error term
    p = 1
    alpha = - h2**p / (h1**p - h2**p)
    beta = 1 - alpha
    richardson = alpha*fd1 + beta*fd2

    true = df(x)

    print("Err FD1: %g\tErr FD2: %g\tErr Rich: %g" % (
            abs(true-fd1),
            abs(true-fd2),
            abs(true-richardson)))
Err FD1: 0.08581	Err FD2: 0.0448122	Err Rich: 0.00381441
Err FD1: 0.0448122	Err FD2: 0.022862	Err Rich: 0.000911846
Err FD1: 0.022862	Err FD2: 0.0115423	Err Rich: 0.000222501
Err FD1: 0.0115423	Err FD2: 0.00579859	Err Rich: 5.49282e-05
Err FD1: 0.00579859	Err FD2: 0.00290612	Err Rich: 1.3644e-05
Err FD1: 0.00290612	Err FD2: 0.00145476	Err Rich: 3.39995e-06
Err FD1: 0.00145476	Err FD2: 0.000727804	Err Rich: 8.48602e-07
Since the extrapolated estimate is second-order accurate, halving $h$ should cut the Richardson error by a factor of about $2^2 = 4$. The ratio of the last two errors confirms this:

In [4]:
3.39995e-06 / 8.48602e-07
Out[4]:
4.006530741148383
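
The ratio is indeed close to 4. As a quick sketch (richardson_error is just an illustrative helper, reusing f, df, and x from above), we can estimate the observed convergence order directly from successive error ratios:

In [5]:
from math import log2

def richardson_error(k):
    # illustrative helper: Richardson error at step size h = 2**(-k)
    h = 2.0**(-k)
    fd1 = (f(x + 2*h) - f(x)) / (2*h)
    fd2 = (f(x + h) - f(x)) / h
    return abs(df(x) - (2*fd2 - fd1))

for k in range(3, 9):
    # observed order log2(err(k)/err(k+1)); should approach 2 as h shrinks
    print(log2(richardson_error(k) / richardson_error(k+1)))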