Newton’s Divided-Difference and Lagrange Interpolating Polynomials: A Python Implementation

Daniel · Published in Analytics Vidhya · May 29, 2020

This blog was inspired by studying for my ENGG 414 (Numerical Methods) exam.

Maybe you are doing an experiment, or you have a dataset you want to explore, but upon reviewing it you notice that it is missing some data points that might be important for whatever you are doing. So, what should you do? Should you panic? Absolutely not, because Newton and Lagrange have got your back.

All you need to do is to interpolate.

Interpolation is a process of estimating intermediate values between precise data points. The most common method used for this purpose is polynomial interpolation.

Using interpolation, you can now estimate datapoints that might be missing from your data. But how exactly are you going to do that?

Newton’s Divided-difference and Lagrange interpolating polynomials provide simple algorithms that can easily be implemented on a computer.

Newton’s Divided-difference interpolating polynomial

The Newton polynomial of order n through the points (x_0, y_0), (x_1, y_1), ..., (x_n, y_n) can be written as

$$f_n(x) = b_0 + b_1(x - x_0) + b_2(x - x_0)(x - x_1) + \cdots + b_n(x - x_0)(x - x_1)\cdots(x - x_{n-1})$$

https://en.wikipedia.org/wiki/Newton_polynomial
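The coefficients b_0, b_1, ..., b_n are finite divided differences, b_i = f[x_0, x_1, ..., x_i], built up recursively from the data:

$$f[x_i] = f(x_i), \qquad f[x_i, x_{i+1}, \ldots, x_{i+j}] = \frac{f[x_{i+1}, \ldots, x_{i+j}] - f[x_i, \ldots, x_{i+j-1}]}{x_{i+j} - x_i}.$$

This recursion is exactly what the fdd table in the code below stores, and the coefficients are read off its top row as fdd[0][i].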

Enough of the crazy part; let’s go to the implementation!

Note: The implementation was done in Python 3 so that we could also visualize some graphs using Python libraries like Matplotlib, pandas, and NumPy. A Jupyter notebook was used to write this code.

import pandas as pd
import matplotlib.pyplot as plt
import numpy as np

This is the algorithm for Newton’s polynomial. It accepts arrays of x and y values and the value xi at which you want to interpolate. The function returns the interpolated values for the different orders paired with their respective error estimates, which we will use for visualization. (You can modify the code to return only the interpolated values at the different orders if you don’t need to visualize.)

def newton_interpolation(x, y, xi):
    # length/number of data points
    n = len(x)
    # divided-difference table initialization
    fdd = [[None for _ in range(n)] for _ in range(n)]
    # f(x) estimates at the different orders
    yint = [None for _ in range(n)]
    # error estimates (change from the previous order)
    ea = [None for _ in range(n)]

    # build the divided-difference table
    for i in range(n):
        fdd[i][0] = y[i]
    for j in range(1, n):
        for i in range(n - j):
            fdd[i][j] = (fdd[i + 1][j - 1] - fdd[i][j - 1]) / (x[i + j] - x[i])

    # print the divided-difference table
    fdd_table = pd.DataFrame(fdd)
    print(fdd_table)

    # interpolate at xi, adding one order at a time
    xterm = 1
    yint[0] = fdd[0][0]
    for order in range(1, n):
        xterm = xterm * (xi - x[order - 1])
        yint2 = yint[order - 1] + fdd[0][order] * xterm
        ea[order - 1] = yint2 - yint[order - 1]
        yint[order] = yint2

    # return (estimate, error) pairs for a pandas DataFrame
    return map(lambda yy, ee: [yy, ee], yint, ea)

For our sample problem, we will try to estimate the value of ln(2) using the following data points:

x = [1,4,6, 5, 3, 1.5, 2.5, 3.5]
y = [0, 1.3862944, 1.7917595, 1.6094379, 1.0986123 , 0.4054641, 0.9162907, 1.2527630]

We can use our newton_interpolation function to solve for ln(2)

a = newton_interpolation(x, y, 2)
df = pd.DataFrame(a, columns=['f(x)','error'])

and it will show the following data frame:

Divided difference dataframe
Value of f(x) using different polynomial orders and their relative errors
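To see where the first few f(x) values come from, the zeroth-, first-, and second-order estimates at x = 2 expand to

$$f_0(2) = b_0 = 0, \qquad f_1(2) = 0 + 0.4620981\,(2 - 1) = 0.4620981,$$
$$f_2(2) = 0.4620981 + (-0.0518731)\,(2 - 1)(2 - 4) = 0.5658444,$$

where b_1 = f[x_0, x_1] = (1.3862944 - 0)/(4 - 1) and b_2 = f[x_0, x_1, x_2] are read off the top row of the divided-difference table.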

And now we have estimated the value of ln(2) using our eight data points. The highest order yields the closest approximation, ln(2) ≈ 0.69348.

We can visualize these polynomial estimates of different orders and compare them to the actual function f(x) = ln(x).

Estimated value of ln(2) using Newton’s interpolating polynomial
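As one possible way to visualize this (a minimal sketch using the df DataFrame built above, not necessarily the code behind the original figure):

# Sketch: successive Newton estimates of ln(2) by polynomial order,
# compared against the true value (assumes df from the code above)
plt.plot(range(len(df)), df['f(x)'], marker='o', label="Newton estimate of ln(2)")
plt.axhline(y=np.log(2), color='k', linestyle='--', label="true ln(2)")
plt.xlabel("polynomial order")
plt.ylabel("estimated value")
plt.legend()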

It’s amusing, isn’t it?

Now let’s hear more about Lagrange and his approach to the interpolation world.

Lagrange interpolating polynomial

Given a set of k + 1 data points

$$(x_0, y_0), (x_1, y_1), \ldots, (x_k, y_k)$$

where no two x_j are the same, the interpolation polynomial in the Lagrange form is the linear combination

$$L(x) = \sum_{j=0}^{k} y_j \, \ell_j(x)$$

of Lagrange basis polynomials

$$\ell_j(x) = \prod_{\substack{0 \le m \le k \\ m \ne j}} \frac{x - x_m}{x_j - x_m}.$$
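As a quick hand check using the first three data points from our set, (1, 0), (4, 1.3862944), and (6, 1.7917595), the second-order Lagrange estimate at x = 2 is

$$f_2(2) = 0 \cdot \frac{(2-4)(2-6)}{(1-4)(1-6)} + 1.3862944 \cdot \frac{(2-1)(2-6)}{(4-1)(4-6)} + 1.7917595 \cdot \frac{(2-1)(2-4)}{(6-1)(6-4)} \approx 0.5658444,$$

which matches the second-order Newton estimate above, as it must: the interpolating polynomial through a given set of points is unique. Each basis polynomial is 1 at its own data point and 0 at the others, so the curve passes through every point exactly.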

Geez, what was that? Okay, enough theory; let’s get to the more exciting part.

Below is the implementation of that crazy math expression for the Lagrange polynomial. The function takes arrays of x and y values and the value you want to interpolate at, xx. Compared to the Newton polynomial function, this one is set up to compute a single nth-order prediction, where n + 1 is the number of data points.

def lagrange_interpolation(x, y, xx):
    n = len(x)
    total = 0
    for i in range(n):
        # i-th Lagrange basis polynomial evaluated at xx, weighted by y[i]
        product = y[i]
        for j in range(n):
            if i != j:
                product = product * (xx - x[j]) / (x[i] - x[j])
        total = total + product
    return total
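As a quick sanity check, evaluating the Lagrange polynomial through all eight points at x = 2 should match the highest-order Newton estimate above, since both methods construct the same unique seventh-order interpolating polynomial:

# Full seventh-order Lagrange estimate of ln(2); should agree with the
# highest-order Newton result, since both build the same unique polynomial
print(lagrange_interpolation(x, y, 2))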

Using the same data points as in the previous approach, we can also solve for the estimated value of ln(2) using the Lagrange polynomial and graph the interpolation at different orders.

# plot the data points, the true function f(x) = ln(x),
# and the seventh-order Lagrange interpolation
plt.scatter(x, y)
xx = np.linspace(0, 8, 100)
yy = [lagrange_interpolation(x, y, i) for i in xx]
xo = np.linspace(0.01, 8, 200)
plt.plot(xo, np.log(xo), label="f(x)=lnx", color="black")
plt.grid(True, which='both')
plt.axvline(x=0, color='k')
plt.axhline(y=0, color='k')
plt.plot(xx, yy)
plt.legend()

# plot interpolations of decreasing order by dropping data points one at a time
fig = plt.figure()
for i in range(1, len(x)):
    g1 = fig.add_subplot(len(x) // 2, 2, i)
    g1.plot(xx, lagrange_interpolation(x[i - 1:], y[i - 1:], xx))
    g1.scatter(x, y)
    plt.grid(True, which='both')
    plt.axvline(x=0, color='k')
    plt.axhline(y=0, color='k')

The above code will give us these beautiful graphs.

Seventh-order interpolation graph
Graph of different orders of interpolation using Lagrange

I was on the edge of my seat seeing these graphs!

Now,

Which of the two should you use?

For that question, let me quote Mark from math.stackexchange.com

If you want to have an easy formula for the remainder of the interpolation then it is much better to work with Newton’s method. Another advantage is that if you found the interpolation polynomial in the points x0, x1,…,xn and then you want to add the point xn+1 then using Newton’s method you can use the polynomial in the points x0,…,xn to easily find the polynomial for the points x0,…,xn+1. You don’t have to find all the coefficients all over again.

On the other hand, if you want to do Numerical differentiation or Numerical integration then it is much easier to work with Lagrange’s method. So having different ways to calculate the interpolation polynomial gives a lot more options. It all depends on what exactly you need to do.
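To make the point about adding a new point concrete, here is a rough sketch of how the Newton form can be extended one point at a time by keeping only the last anti-diagonal of the divided-difference table (the names add_point, diag, and coeffs are purely illustrative):

def add_point(xs, diag, coeffs, x_new, y_new):
    # xs: points added so far
    # diag[k]: f[x_k, ..., x_last], the last anti-diagonal of the divided-difference table
    # coeffs[i]: f[x_0, ..., x_i], the Newton coefficients found so far
    new_diag = [0.0] * (len(xs) + 1)
    new_diag[-1] = y_new
    for k in range(len(xs) - 1, -1, -1):
        new_diag[k] = (new_diag[k + 1] - diag[k]) / (x_new - xs[k])
    xs.append(x_new)
    coeffs.append(new_diag[0])  # only one new coefficient is computed
    return new_diag             # becomes diag for the next update

Starting from a single point with diag = [y[0]] and coeffs = [y[0]], feeding in the remaining points one at a time reproduces the same coefficients as the full fdd table, with only O(n) extra work per new point.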

In terms of algorithmic complexity, both methods run in O(n²) time, but Newton’s method needs an extra O(n²) of space for the divided-difference table.

Have fun interpolating!

References

Chapra, S. C., & Canale, R. P. (2015). Numerical Methods for Engineers (7th ed.). New York: McGraw-Hill Education.

Lagrange polynomial. (2020, May). Retrieved from Wikipedia: https://en.wikipedia.org/wiki/Lagrange_polynomial

What is the difference between Newton interpolation and Lagrange interpolation? (2018). Retrieved from Mathematics Stack Exchange: https://math.stackexchange.com/questions/2884888/what-is-difference-between-newton-interpolation-and-lagrange-interpolation

Newton polynomial. (2020, May). Retrieved from Wikipedia: https://en.wikipedia.org/wiki/Newton_polynomial
