"Note that $x$ and $\\Delta x$ can be written interchangeably when $x_i=0$ as $\\Delta x=x-x_i$. If this would not be the case, for example if $x_i=\\pi$ the result is completely different.\n",
"\n",
"```\n",
"Compute the TSE polynomial truncated until $\\mathcal{O}(\\Delta x)^6$ for $f(x)=\\sin(x)$ around $x_i=\\pi$\n",
"Compute the TSE polynomial truncated until $\\mathcal{O}(\\Delta x)^6$ for $f(x)=\\sin(x)$ around $x_i=\\frac{\\pi}{2}$\n",
This series is **exact** as long as we include infinitely many terms. We, however, are limited to a **truncated** expression: an **approximation** of the real function.
Using only 3 terms (here, the derivative terms up to $f'''(x_i)$) and defining $\Delta x=x-x_i$, the TSE can be rewritten as

$$
f(x_{i+1}) = f(x_i+\Delta x) = f(x_i) + \Delta x f'(x_i) + \frac{\Delta x^2}{2!}f''(x_i) + \frac{\Delta x^3}{3!}f'''(x_i) + \mathcal{O}(\Delta x)^4
$$
Here $\Delta x$ is the distance between the point we "know", $x_i$, and the desired point, $x$. $\mathcal{O}(\Delta x)^4$ means that we do not take into account the terms associated with $\Delta x^4$ and higher powers, and therefore **that is the truncation error order**. From here we can also conclude that **the larger the step $\Delta x$, the larger the error**!
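To see this in practice, here is a small Python sketch (the function $\sin(x)$ and the expansion point $x_i=1$ are arbitrary choices, purely for illustration): each halving of $\Delta x$ should reduce the error of the 3-term polynomial by roughly $2^4=16$.

```python
import numpy as np

def tse_3_terms(xi, dx):
    """Taylor polynomial of sin around xi, keeping the derivative terms up to f'''."""
    return (np.sin(xi) + dx*np.cos(xi)
            - dx**2/2*np.sin(xi) - dx**3/6*np.cos(xi))

xi = 1.0
for dx in [0.4, 0.2, 0.1, 0.05]:
    err = abs(np.sin(xi + dx) - tse_3_terms(xi, dx))
    print(f"dx = {dx:5.2f}   error = {err:.2e}")
# Each halving of dx reduces the error by roughly 2**4 = 16, consistent with O(dx**4).
```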
```{tip}
We will use $\Delta x$ more frequently from this point on, so it is good to recognize now, using the equations above, that it is a different way of representing the differential increment, for example, $f(x_{i+1})=f(x_i+\Delta x)$ or $f(x_{i+2})=f(x_i+2\Delta x)$.
```
Now let's see an example.
---
:::{card} Example
Compute $e^{0.2}$ using 3 terms of TSE around $x_i=0$
```{admonition} Solution
:class: tip, dropdown
We want to evaluate $e^x=e^{x_i+\Delta x}=e^{0.2}$; therefore, $\Delta x=0.2$. The value of $f(x_i=0)=e^0=1$. For this exponential, the derivatives all have the same value: $f'(x_i=0)=f''(x_i=0)=f'''(x_i=0)=e^0=1$. The TSE then looks like

$$
e^{0.2} \approx f(x_i)+\Delta x f'(x_i)+\frac{\Delta x^2}{2!}f''(x_i)+\frac{\Delta x^3}{3!}f'''(x_i)=1+0.2+\frac{0.2^2}{2}+\frac{0.2^3}{6}\approx 1.2213,
$$

which is close to the exact value $e^{0.2}\approx 1.2214$.
```
:::

:::{card} Example
Compute the TSE polynomial truncated until $\mathcal{O}(\Delta x)^6$ for $f(x)=\sin(x)$ around $x_i=0$
```{admonition} Solution
:class: tip, dropdown
$$
\sin(x) \approx x-\frac{x^3}{6}+\frac{x^5}{120} \text{ with an error }\mathcal{O}(\Delta x)^6.
$$
Note that $x$ and $\Delta x$ can be written interchangeably when $x_i=0$, as $\Delta x=x-x_i$. If this were not the case, for example if $x_i=\pi$, the result would be completely different.
```
:::
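As a quick way to check expansions like these (including the exercises that follow), SymPy can generate truncated Taylor polynomials; the small helper below is only a sketch for self-checking, and the name `taylor_poly` is an arbitrary choice.

```python
import sympy as sp

x, dx = sp.symbols('x Delta_x')

def taylor_poly(expr, xi, order):
    """Taylor polynomial of expr around x = xi, written in terms of Delta_x."""
    series = sp.series(expr, x, xi, order).removeO()
    return sp.expand(series.subs(x, xi + dx))

print(taylor_poly(sp.sin(x), 0, 6))        # Delta_x - Delta_x**3/6 + Delta_x**5/120
print(taylor_poly(sp.sin(x), sp.pi, 6))    # expansion around x_i = pi
print(taylor_poly(sp.sin(x), sp.pi/2, 6))  # expansion around x_i = pi/2
```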
:::{card} Exercises
Compute the TSE polynomial truncated until $\mathcal{O}(\Delta x)^6$ for $f(x)=\sin(x)$ around $x_i=\pi$.

Compute the TSE polynomial truncated until $\mathcal{O}(\Delta x)^6$ for $f(x)=\sin(x)$ around $x_i=\frac{\pi}{2}$.
:::
When the TSEs for $f(x_{i+1})$ and $f(x_{i-1})$ are subtracted to construct the **central difference**, the second derivative terms cancel each other out, and therefore **the error order is the step size squared!** From here, it is obvious that the central difference is more accurate. You can also notice this intuitively in the figure of the previous chapter.
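To see the cancellation explicitly, write the forward and backward expansions up to the second-derivative term and subtract them:

$$
f(x_i+\Delta x) = f(x_i) + \Delta x f'(x_i) + \frac{\Delta x^2}{2!}f''(x_i) + \mathcal{O}(\Delta x)^3
$$

$$
f(x_i-\Delta x) = f(x_i) - \Delta x f'(x_i) + \frac{\Delta x^2}{2!}f''(x_i) + \mathcal{O}(\Delta x)^3
$$

Subtracting the second expansion from the first, the $f''(x_i)$ terms drop out:

$$
f'(x_i) = \frac{f(x_i+\Delta x)-f(x_i-\Delta x)}{2\Delta x} + \mathcal{O}(\Delta x)^2
$$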
:::{card} Exercises
Derive the forward difference for the first derivative $f'(x_i)$ with a first order error $\mathcal{O}(\Delta x)$ using Taylor series
```{admonition} Solution
:class: tip, dropdown
Start from the TSE
(tip: first derivative and first order error, $1+1=2$, so we need a TSE truncated until the 2nd order):
$$
f(x_i+\Delta x) = f(x_i) + \Delta x f'(x_i)+ \mathcal{O}(\Delta x)^2
$$
Rearrange to get:
$$
- \Delta x f'(x_i)= -f(x_i+\Delta x)+ f(x_i)+\mathcal{O}(\Delta x)^2
$$
Divide by $ - \Delta x $:
$$
f'(x_i)= \frac{f(x_i+\Delta x)- f(x_i)}{\Delta x } + \mathcal{O}(\Delta x)
$$
```
Use Taylor series to derive the backward difference for the first derivative $f'(x_i)$ with a first order error $\mathcal{O}(\Delta x)$
```{admonition} Solution
:class: tip, dropdown
Start from the TSE
(tip: first derivative and first order error, $1+1=2$, so we need a TSE truncated until the 2nd order):
$$f(x_i-\Delta x) = f(x_i) - \Delta x f'(x_i)+ \mathcal{O}(\Delta x)^2$$
Rearrange to get:
$$ \Delta x f'(x_i)= -f(x_i-\Delta x)+ f(x_i)+\mathcal{O}(\Delta x)^2 $$

Divide by $\Delta x$:

$$
f'(x_i)= \frac{f(x_i)- f(x_i-\Delta x)}{\Delta x} + \mathcal{O}(\Delta x)
$$
```
:::
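As a quick numerical check of both one-sided formulas (a sketch; the test function $f(x)=e^x$ and the point $x_i=1$ are arbitrary choices for illustration), halving $\Delta x$ should roughly halve the error, consistent with $\mathcal{O}(\Delta x)$:

```python
import numpy as np

f = np.exp                       # test function f(x) = e^x, so f'(x) = e^x
xi, exact = 1.0, np.exp(1.0)

for dx in [0.2, 0.1, 0.05, 0.025]:
    fwd = (f(xi + dx) - f(xi)) / dx   # forward difference
    bwd = (f(xi) - f(xi - dx)) / dx   # backward difference
    print(f"dx = {dx:6.3f}   forward error = {abs(fwd - exact):.2e}"
          f"   backward error = {abs(bwd - exact):.2e}")
# Both errors shrink roughly linearly with dx, i.e. first-order accuracy.
```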
There are equations that require second derivatives; one example is the 1-D diffusion equation, which reads:
$$
\frac{\partial f}{\partial t}=v\frac{\partial^2 f}{\partial x^2} \text{ where } v \text{ is the diffusion coefficient.}
$$
For the moment we will use TSE to find **only** a numerical expression of the second derivative $\frac{\partial^2 f}{\partial x^2}$.
The procedure is simple but cumbersome. The general idea is to isolate the second derivative in the TSE without any dependency on other derivatives. Below you can find more details about the algebraic manipulation (if you are curious), but you do not need to know it. Here is the result:

$$
f''(x_i)= \frac{f(x_i+2\Delta x)-2f(x_i+\Delta x)+f(x_i)}{\Delta x^2} + \mathcal{O}(\Delta x)
$$
This is the **forward** difference approximation of the second derivative. Two aspects come to mind: one more point is needed to compute the second derivative, and the error (of this simplest second-derivative approximation) is again of the order of the step. There are also **backward** and **central** approximations of the second derivative (not shown here).
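As a sanity check of this formula (a sketch; the function $f(x)=\sin(x)$ and the grid on $[0,1]$ are arbitrary choices), it can be applied on a grid and compared with the exact second derivative $-\sin(x)$:

```python
import numpy as np

for dx in [0.02, 0.01]:
    x = np.arange(0.0, 1.0 + dx, dx)
    f = np.sin(x)

    # Forward approximation: f''(x_i) ~ (f(x_i + 2*dx) - 2*f(x_i + dx) + f(x_i)) / dx**2.
    # Element k uses f[k], f[k+1], f[k+2], so it estimates f'' at x[k] for k = 0 .. N-3.
    d2f_fwd = (f[2:] - 2*f[1:-1] + f[:-2]) / dx**2
    exact = -np.sin(x[:-2])

    print(f"dx = {dx:5.3f}   max error = {np.max(np.abs(d2f_fwd - exact)):.2e}")
# Halving dx roughly halves the maximum error -> the error is O(dx).
```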
**Derivation of the forward difference approximation of the second derivative**
First define a TSE for a point two steps farther from $x_i$:

$$
f(x_i+2\Delta x) = f(x_i) + 2\Delta x f'(x_i) + \frac{(2\Delta x)^2}{2!}f''(x_i) + \mathcal{O}(\Delta x)^3
$$
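Combining this with the TSE for $f(x_i+\Delta x)$ truncated at the same order, a sketch of the remaining algebra:

$$
f(x_i+\Delta x) = f(x_i) + \Delta x f'(x_i) + \frac{\Delta x^2}{2!}f''(x_i) + \mathcal{O}(\Delta x)^3
$$

Multiplying this expansion by 2 and subtracting it from the one for $f(x_i+2\Delta x)$ eliminates the first-derivative term:

$$
f(x_i+2\Delta x) - 2f(x_i+\Delta x) = -f(x_i) + \Delta x^2 f''(x_i) + \mathcal{O}(\Delta x)^3
$$

Isolating $f''(x_i)$ and dividing by $\Delta x^2$ recovers the expression given above:

$$
f''(x_i) = \frac{f(x_i+2\Delta x)-2f(x_i+\Delta x)+f(x_i)}{\Delta x^2} + \mathcal{O}(\Delta x)
$$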
So far we have found expressions with a relatively large error magnitude $\mathcal{O}(\Delta x)$, with the exception of the central difference approximation. Sometimes a higher accuracy is desired, for which better expressions can be found. The procedure is similar to the algebraic manipulation used to find the **forward** approximation of the second derivative: a series of TSEs is defined at varying distances from $x_i$ and, after algebraic manipulation, a more accurate expression is found. For example, the **forward** approximation of the first derivative:

$$
f'(x_i) = \frac{-3f(x_i)+4f(x_i+\Delta x)-f(x_i+2\Delta x)}{2\Delta x} + \mathcal{O}(\Delta x^2)
$$
The error magnitude has improved to $\mathcal{O}(\Delta x^2)$ at the expense of using one more point. The accuracy can be improved even further by using more points. It is important to note that central differences are more accurate than forward and backward differences that use the same number of points.
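To make these orders concrete, the sketch below compares the three formulas seen so far on $f(x)=\sin(x)$ at $x_i=1$ (arbitrary illustrative choices): when $\Delta x$ is halved, the first-order forward error roughly halves, while the central and 3-point forward errors drop by roughly a factor of four.

```python
import numpy as np

f = np.sin                    # test function, f'(x) = cos(x)
xi, exact = 1.0, np.cos(1.0)

for dx in [0.2, 0.1, 0.05]:
    fwd1 = (f(xi + dx) - f(xi)) / dx                          # 2-point forward, O(dx)
    cen2 = (f(xi + dx) - f(xi - dx)) / (2*dx)                 # central, O(dx**2)
    fwd2 = (-3*f(xi) + 4*f(xi + dx) - f(xi + 2*dx)) / (2*dx)  # 3-point forward, O(dx**2)
    print(f"dx = {dx:5.2f}   forward: {abs(fwd1 - exact):.1e}"
          f"   central: {abs(cen2 - exact):.1e}"
          f"   3-point forward: {abs(fwd2 - exact):.1e}")
```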
:::{card} Exercise
Derive a finite-difference equation for the first derivative $f'(x_i)$ with a second-order error $\mathcal{O}(\Delta x^2)$, using the following nodes: $x_i$, $x_i+\Delta x$ and $x_i+2\Delta x$ (or, equivalently, $x_i$, $x_{i+1}$ and $x_{i+2}$).
The finite-difference equation will have the following form (the coefficients $\alpha$, $\beta$ and $\gamma$ will contain $\Delta x$):

$$
f'(x_i) \approx \alpha f(x_i) + \beta f(x_i+\Delta x) + \gamma f(x_i+2\Delta x)
$$
Use the Taylor series to find $\alpha$, $\beta$ and $\gamma$
```{admonition} Solution
:class: tip, dropdown
Taylor series for $f(x_i+\Delta x)$ and $f(x_i+2\Delta x)$ (tip: first derivative and second order error, $1+2=3$, meaning a truncation until the 3rd order):