Monthly Archives: July 2017

Numerical approximation of derivatives

Suppose we want to numerically approximate the second-order derivative \frac{d^2y}{dx^2}.

A common way is to use a finite difference scheme, which is derived from Taylor expansions:
1. Perform Taylor expansions for y_{i+1}, y_{i}, y_{i-1}, etc.
2. Combine the Taylor series and cancel the high-order terms to construct a scheme of the desired order, i.e., find a set of coefficients that makes the sum of the high-order terms zero.
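The two steps above can be sketched numerically. The snippet below (an illustrative sketch, not from the post) derives the finite-difference weights for the second derivative on the three-point stencil \{-1, 0, 1\} by matching Taylor-series terms: each row of the small linear system enforces that one Taylor term is either cancelled or kept.

```python
import numpy as np
from math import factorial

# Stencil offsets (in units of the grid spacing h) and the derivative
# order we want to approximate -- both chosen for illustration.
offsets = np.array([-1.0, 0.0, 1.0])
order = 2

# Row k enforces: sum_j w_j * offsets_j^k / k! == (1 if k == order else 0),
# i.e. the k-th Taylor term is kept only for k == order.
M = np.array([offsets**k / factorial(k) for k in range(len(offsets))])
rhs = np.zeros(len(offsets))
rhs[order] = 1.0

weights = np.linalg.solve(M, rhs)
print(weights)  # -> weights [1, -2, 1], i.e. (y_{i+1} - 2 y_i + y_{i-1}) / h^2
```

Solving this small system recovers the familiar central-difference weights without memorizing the formula; larger stencils work the same way.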

For example, the second-order central difference scheme:
\frac{d^2y}{dx^2} \approx \frac{y_{i+1}-2y_{i}+y_{i-1}}{\Delta x^2}
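As a quick numerical check (illustrative, not from the post), we can apply this central difference to y = \sin(x), whose exact second derivative is -\sin(x); the test point and step size below are arbitrary choices.

```python
import numpy as np

h = 1e-3  # step size (assumed for illustration)
x = 0.7   # evaluation point (assumed for illustration)

# Second-order central difference for the second derivative
approx = (np.sin(x + h) - 2 * np.sin(x) + np.sin(x - h)) / h**2
exact = -np.sin(x)
print(abs(approx - exact))  # error is O(h^2), so tiny for h = 1e-3
```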

An alternative way to approximate the derivative:
1. Express y as y=a+bx+cx^2
Differentiating twice gives \frac{d^2y}{dx^2}=2c, so the objective is to find the coefficient c.

2. Use the available data to fit this second-order polynomial (a parabola; this is actually a curve-fitting problem)
y_1 = a+bx_1+cx_1^2
y_2 = a+bx_2+cx_2^2
y_3 = a+bx_3+cx_3^2
...
y_n = a+bx_n+cx_n^2

3. Solve this fitting problem
Since we have three unknown coefficients a, b, and c, we only need three data points. However, we can choose more data points to fit this curve, which is actually a least-squares fit. Now, let us write the above equations in a general matrix form: Ax=B
where

    \[ A= \begin{bmatrix} 1 & x_{1} & x_{1}^2 \\ 1 & x_{2} & x_{2}^2 \\ 1 & x_{3} & x_{3}^2 \\ \vdots & \vdots & \vdots \\ 1 & x_{n} & x_{n}^2 \end{bmatrix} , \quad x= \begin{bmatrix} a\\ b\\ c \end{bmatrix} , \quad \textrm{and} \quad B= \begin{bmatrix} y_1 \\ y_2 \\ y_3 \\ \vdots \\ y_n \end{bmatrix} . \]

This is an over-determined system if n>3, which we can solve approximately with least-squares fitting.
See https://en.wikipedia.org/wiki/Overdetermined_system

4. Solution
The least-squares solution is x=(A^TA)^{-1}A^TB. The fitted coefficient c is the third component of x, and the second-order derivative is then \frac{d^2y}{dx^2} \approx 2c.
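Steps 2–4 can be sketched with NumPy's least-squares solver (a minimal sketch; the test function y = x^2, whose exact second derivative is 2, and the sample points are assumptions chosen for illustration).

```python
import numpy as np

# Sample n = 5 points around x = 0 from the assumed test function y = x^2
x = np.linspace(-0.1, 0.1, 5)
y = x**2

# Build the matrix A with columns [1, x, x^2], then solve min ||A x - B||
A = np.column_stack([np.ones_like(x), x, x**2])
coeffs, *_ = np.linalg.lstsq(A, y, rcond=None)
a, b, c = coeffs

print(2 * c)  # second-derivative estimate; exact value is 2
```

Here `np.linalg.lstsq` solves the normal equations (A^TA)x=A^TB internally in a numerically stable way, so we never form (A^TA)^{-1} explicitly.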