%% Cell type:markdown id: tags:
# Revision of Concepts
%% Cell type:markdown id: tags:
## Classification of Differential Equations
%% Cell type:markdown id: tags:
```{note}
**Important things to retain from this block:**
* Identify characteristics of differential equations
* Understand how numerical and analytical solutions might differ as we move to more complex problems that need simplification
* Remember that analytical solutions are not always possible
```
%% Cell type:markdown id: tags:
Differential equations can be classified into Ordinary Differential Equations (ODEs) and Partial Differential Equations (PDEs). **ODEs** have derivatives with respect to a **single** independent variable (either time **or** space), for example
$$
\frac{d x(t)}{d t} = \cos t
$$
describes the rate of change of the variable $x$ in **time $t$**.
**PDEs** have derivatives with respect to **multiple** independent variables (often time **and** space), for example:
$$
\frac{\partial c(x,t)}{\partial t} + u\frac{\partial c(x,t)}{\partial x} = 0
$$
describes the propagation of the concentration $c$ in **time $t$** along **dimension $x$**.
The classification can be made more precise by order and linearity (linear or non-linear). The order refers to the highest derivative present, while non-linear equations are those in which the dependent variable or its derivative(s) appear non-linearly (e.g., inside an exponential or sine, or raised to a power other than 1). See the following examples:
$$
\frac{d^3y}{dx^3} - x\frac{d^2y}{dx^2} + y = 0 \qquad \text{third-order linear ODE}
$$
$$
\frac{dy}{dx} = y^2 + x \qquad \text{first-order non-linear ODE}
$$
$$
\frac{d^2y}{dx^2} + y\left(\frac{dy}{dx}\right)^2 = \sin(y) \qquad \text{second-order non-linear ODE}
$$
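%% Cell type:markdown id: tags:
As an aside (not part of the original text): for simple cases such as the first ODE above, a computer algebra system can find the analytical solution directly. A minimal sketch with `sympy.dsolve`:
%% Cell type:code id: tags:
``` python
import sympy as sp

t = sp.symbols('t')
x = sp.Function('x')

# First example above: dx/dt = cos(t)
ode = sp.Eq(x(t).diff(t), sp.cos(t))
sp.dsolve(ode, x(t))  # x(t) = C1 + sin(t)
```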
%% Cell type:markdown id: tags:
## Analytical vs Numerical Solutions
%% Cell type:markdown id: tags:
Equations can be solved in two ways: analytically and numerically. The analytical solution is **exact**, while a numerical solution requires computational methods to **approximate** the solution. So why would we use anything other than analytical solutions? Well, analytical solutions are difficult or impossible to find for complex equations, especially when the problem involves a complex geometry. This complexity will be treated later in the book. For now, let's illustrate analytical and numerical solutions considering a simple problem.
:::{card}
> Find the value $x$ for $f(x)=3x-2$ when $f(x) = 0$ in the interval $[0,1]$.
**Analytical Solution**
$$ f(x) = 0 \Leftrightarrow 3x-2=0 \Leftrightarrow x = \frac{2}{3}=0.666666666666666..$$
Note that there is no need for any extra computation. You can say that there is only **one computation needed**: the assignment of the $x$ value.
:::
%% Cell type:markdown id: tags:
:::{card} **Numerical Solution**
The code below shows an iterative method to find the x value.
:::
%% Cell type:code id: tags:
``` python
def f(x):
    return 3*x-2

dx = 0.001
for i in range(1000):
    x = dx*i
    # a sign change between f(x) and f(x+dx) means the root lies in this interval
    if f(x) * f(x+dx) < 0:
        print("Number of computations needed to find a solution",i*4)
        break
print("Answer x = ", x+dx)
```
%% Output
Number of computations needed to find a solution 2664
Answer x = 0.667
%% Cell type:markdown id: tags:
Note that the 2664 computations needed are highly dependent on the method (this one is not very efficient). Here the search starts at $x=0$ and increases in steps of 0.001, which also limits the accuracy of the solution.
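%% Cell type:markdown id: tags:
For comparison, a dedicated root-finding routine needs far fewer evaluations than this linear scan. The cell below is a minimal sketch (not part of the original notebook) using `scipy.optimize.brentq` on the same function and interval:
%% Cell type:code id: tags:
``` python
from scipy.optimize import brentq

def f(x):
    return 3*x - 2

# Brent's method: a bracketing root finder combining bisection with interpolation
root = brentq(f, 0, 1)
print("Answer x =", root)
```
%% Cell type:markdown id: tags: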
:::{card} Exercise
Let us now look at another simple example:
> Find the value $x$ for $f(x) = 3\sin(x)-2$ when $f(x) = 0$ in the interval $[0,1]$.
**Analytical Solution**
$$ f(x) = 0 \Leftrightarrow 3\sin(x)-2=0 \Leftrightarrow x = \arcsin\left(\frac{2}{3}\right) = 0.7297276562269663..$$
:::
%% Cell type:markdown id: tags:
:::{card}
**Numerical Solution**
:::
%% Cell type:code id: tags:
``` python
import math

def f(x):
    return 3*math.sin(x)-2

dx = 0.001
for i in range(1000):
    x = dx*i
    if f(x) * f(x+dx) < 0:
        print("Number of computations needed to find a solution",i*4)
        break
print("Answer x = ", x+dx)
```
%% Output
Number of computations needed to find a solution 2916
Answer x = 0.73
%% Cell type:markdown id: tags:
Note that there is no attempt to solve $f(x)=0$ directly. Only the function is defined, and the exact same steps as in the previous problem are followed.
......
%% Cell type:markdown id: tags:
# Taylor Series Expansion
%% Cell type:markdown id: tags:
```{note}
**Important things to retain from this block:**
* Understand how to compute Taylor series expansion of a given function around a given point and its limitations
* Understand how numerical derivatives are derived from TSE
* Understand that the magnitude of the error depends on the expression used
**Things you do not need to know:**
* Any kind of Taylor expansion of a function by heart
```
%% Cell type:markdown id: tags:
## Definition
The Taylor Series Expansion (TSE) of an arbitrary function $f(x)$ around $x=x_i$ is a polynomial of varying order given by
$$f(x) = f(x_i) + (x-x_i)f'(x_i)+\frac{(x-x_i)^2}{2!}f''(x_i)+ \frac{(x-x_i)^3}{3!} f'''(x_i)+ ...$$
This series is **exact** as long as we include infinitely many terms. We, however, are limited to a **truncated** expression: an **approximation** of the real function.
Using only 3 terms and defining $\Delta x=x-x_i$, the TSE can be rewritten as
$$f(x_i+\Delta x) = f(x_i) + \Delta x f'(x_i)+\frac{\Delta x^2}{2!}f''(x_i)+ \frac{\Delta x^3}{3!} f'''(x_i)+ \mathcal{O}(\Delta x)^4.$$
Here $\Delta x$ is the distance between the point we "know" and the desired point. $\mathcal{O}(\Delta x)^4$ means that we do not take into account the terms associated with $\Delta x^4$ and higher powers, and therefore **that is the order of the truncation error**. From here we can also conclude that **the larger the step $\Delta x$, the larger the error**!
Now let's see an example.
---
:::{card} Example
Compute $e^{0.2}$ using 3 terms of TSE around $x_i=0$
```{admonition} Solution
:class: tip, dropdown
We want to evaluate $e^x=e^{x_i+\Delta x}=e^{0.2}$. Therefore, $\Delta x=0.2$. The value of $f(x_i=0)=e^0=1$. For the case of this exponential, the derivatives have the same value $f'(x_i=0)=f''(x_i=0)=f'''(x_i=0)=e^0=1$. The TSE looks like
$$e^{x_i+\Delta x} \approx e^0 + \Delta x e^0 + \frac{\Delta x^2}{2!}e^0 + \frac{\Delta x^3}{3!}e^0 \approx 1 + \Delta x + \frac{\Delta x^2}{2} + \frac{\Delta x^3}{6}\approx 1 + 0.2 + 0.02 + 0.00133 = 1.22133$$
```
Compute the TSE polynomial truncated until $\mathcal{O}(\Delta x)^6$ for $f(x)=\sin(x)$ around $x_i=0$
```{admonition} Solution
:class: tip, dropdown
Applying the definition of TSE:
$$
\sin(x) \approx \sin(x_i+\Delta x) \approx \sin(0) + x\cos(0) - \frac{x^2}{2}\sin(0) - \frac{x^3}{6}\cos(0) + \frac{x^4}{24}\sin(0) + \frac{x^5}{120}\cos(0) +\mathcal{O}(\Delta x)^6
$$
$$
\sin(x) \approx x-\frac{x^3}{6}+\frac{x^5}{120} \text{ with an error }\mathcal{O}(\Delta x)^6.
$$
Note that $x$ and $\Delta x$ can be used interchangeably when $x_i=0$, since $\Delta x=x-x_i$. If this were not the case, for example if $x_i=\pi$, the result would be completely different.
```
Compute the TSE polynomial truncated until $\mathcal{O}(\Delta x)^6$ for $f(x)=\sin(x)$ around $x_i=\pi$
```{admonition} Solution
:class: tip, dropdown
Applying the definition of TSE:
$$\sin(x) \approx \sin(x_i+\Delta x) \approx \sin(\pi) + (x-\pi)\cos(\pi) - \frac{(x-\pi)^2}{2}\sin(\pi) - \frac{(x-\pi)^3}{6}\cos(\pi) + \frac{(x-\pi)^4}{24}\sin(\pi) + \frac{(x-\pi)^5}{120}\cos(\pi) +\mathcal{O}(\Delta x)^6$$
$$\sin(x) \approx \sin(x_i+\Delta x) \approx -(x-\pi) + \frac{(x-\pi)^3}{6} - \frac{(x-\pi)^5}{120} \text{ with an error } \mathcal{O}(\Delta x)^6$$
```
:::
---
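%% Cell type:markdown id: tags:
The first example can also be checked numerically. The cell below (added for illustration) compares the truncated series with `math.exp`:
%% Cell type:code id: tags:
``` python
import math

dx = 0.2
# Truncated TSE of exp(x) around x_i = 0, as in the example above
approx = 1 + dx + dx**2/2 + dx**3/6
exact = math.exp(dx)

print(f"TSE approximation: {approx:.6f}")  # 1.221333
print(f"math.exp(0.2):     {exact:.6f}")   # 1.221403
print(f"Truncation error:  {exact - approx:.2e}")
```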
%% Cell type:markdown id: tags:
:::{card}
How good is this polynomial? How does the result vary with the number of terms used?
Press `rocket` -->`Live Code` to interact with the figure
:::
%% Cell type:code id: tags:thebe-remove-input-init,auto-execute-page
``` python
import numpy as np
import matplotlib.pyplot as plt
from ipywidgets import widgets, interact

def taylor_plt(order_aprox):
    x = np.linspace(-2*np.pi, 2*np.pi, 100)
    plt.plot(x, np.sin(x), label="sin(x)")
    if order_aprox == '1st order':
        plt.plot(x, x, label="1st order approximation")
    elif order_aprox == '3rd order':
        plt.plot(x, x-(1/6*x**3), label="3rd order approximation")
    elif order_aprox == '5th order':
        plt.plot(x, x-(1/6*x**3)+(1/120*x**5), label="5th order approximation")
    elif order_aprox == '7th order':
        plt.plot(x, x-(1/6*x**3)+(1/120*x**5)-(1/5040*x**7), label="7th order approximation")
    plt.ylim(-5, 5)
    plt.axis('off')
    plt.legend()
    plt.show()
```
%% Cell type:code id: tags:
``` python
#run to interact with the figure
interact(taylor_plt, order_aprox = widgets.ToggleButtons(options=['1st order', '3rd order', '5th order','7th order']));
```
%% Output
%% Cell type:markdown id: tags:
Relevant conclusions:
- The 1st order, which depends only on the first derivative evaluation, is a straight line.
- The more terms used (larger order) the smaller the error.
- The farther from the starting point (e.g., in the plots $x_i=0$), the larger the error.
%% Cell type:markdown id: tags:
## Use of TSE to define the first derivative
The TSE definition when $\Delta x = x_{i+1} - x_i$ can be rewritten to solve for the first derivative:
$$
f'(x_i)=\frac{f(x_i+\Delta x)-f(x_i)}{\Delta x} - \frac{\Delta x}{2!}f''(x_i)- \frac{\Delta x^2}{3!} f'''(x_i) - ...
$$
By truncating the derivative to avoid the calculation of the second derivative, we find the **forward difference**
$$
f'(x_i)=\frac{f(x_i+\Delta x)-f(x_i)}{\Delta x} + \mathcal{O}(\Delta x).
$$
This is the same definition of the numerical derivative! Now we have the added knowledge that this comes with an error of the order of the step.
<br><br>
The **backward difference** can be found by replacing $\Delta x$ with $-\Delta x$, i.e., using $\Delta x=x_i-x_{i-1}$:
$$
f'(x_i)=\frac{f(x_i)-f(x_i-\Delta x)}{\Delta x} + \frac{\Delta x}{2!}f''(x_i)- \frac{\Delta x^2}{3!} f'''(x_i) - ...
$$
$$
f'(x_i)=\frac{f(x_i)-f(x_i-\Delta x)}{\Delta x}+ \mathcal{O}(\Delta x).$$
<br><br>
The **central difference** can be found by summing the forward and backward difference expressions of the derivative and dividing the result by 2:
$$
f'(x_i)=\frac{f(x_i+\Delta x)-f(x_i-\Delta x)}{2\Delta x} - \frac{\Delta x^2}{3!} f'''(x_i) - ...
$$
$$
f'(x_i)=\frac{f(x_i+\Delta x)-f(x_i-\Delta x)}{2\Delta x}+ \mathcal{O}(\Delta x^2).
$$
The second derivative terms cancel each other, therefore **the error order is the step size squared!** From here, it is obvious that the central difference is more accurate. You can also see this intuitively in the figure of the previous chapter.
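%% Cell type:markdown id: tags:
These error orders can be observed numerically. The cell below (an illustrative sketch, not part of the original text) approximates $f'(x)$ for $f(x)=\sin(x)$ at $x=1$ and shows how each error shrinks when the step is halved:
%% Cell type:code id: tags:
``` python
import numpy as np

f = np.sin
x, exact = 1.0, np.cos(1.0)

for dx in (0.1, 0.05):
    fwd = (f(x + dx) - f(x)) / dx
    bwd = (f(x) - f(x - dx)) / dx
    cen = (f(x + dx) - f(x - dx)) / (2*dx)
    print(f"dx={dx}: forward error {abs(fwd - exact):.1e}, "
          f"backward error {abs(bwd - exact):.1e}, "
          f"central error {abs(cen - exact):.1e}")
# Halving dx roughly halves the forward/backward errors (first order)
# and reduces the central error by about a factor of four (second order).
```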
%% Cell type:markdown id: tags:
:::{card} Exercises
Derive the forward difference for the first derivative $f'(x_i)$ with a first order error $\mathcal{O}(\Delta x)$ using Taylor series
```{admonition} Solution
:class: tip, dropdown
Start from the TSE
(tip: first derivative and first order error (1+1=2), so we need a TSE truncated at the 2nd order):
$$
f(x_i+\Delta x) = f(x_i) + \Delta x f'(x_i)+ \mathcal{O}(\Delta x)^2
$$
Rearrange to get:
$$
- \Delta x f'(x_i)= -f(x_i+\Delta x)+ f(x_i)+\mathcal{O}(\Delta x)^2
$$
Divide by $ - \Delta x $:
$$
f'(x_i)= \frac{f(x_i+\Delta x)- f(x_i)}{\Delta x } + \mathcal{O}(\Delta x)
$$
```
Use taylor series to derive the backward difference for the first derivative $f'(x_i)$ with a first order error $\mathcal{O}(\Delta x)$
```{admonition} Solution
:class: tip, dropdown
Start from the TSE
(tip: first derivative and first order error, 1+1=2, so we need a TSE truncated at the 2nd order):
$$f(x_i-\Delta x) = f(x_i) - \Delta x f'(x_i)+ \mathcal{O}(\Delta x)^2$$
Rearrange to get:
$$ \Delta x f'(x_i)= -f(x_i-\Delta x)+ f(x_i)+\mathcal{O}(\Delta x)^2 $$
Divide by $ \Delta x $:
$$ f'(x_i)= \frac{ f(x_i)-f(x_i-\Delta x)}{\Delta x } + \mathcal{O}(\Delta x) $$
```
Use taylor series to derive the Central difference for the first derivative $f'(x_i)$ with a 2nd order error $\mathcal{O}(\Delta x)^2$
```{admonition} Solution
:class: tip, dropdown
Start from the TSE of $f(x_i+\Delta x)$ and $f(x_i-\Delta x)$
(tip: first derivative and second order error (1+2=3), so we need TSEs truncated at the 3rd order):
$$
f(x_i+\Delta x) = f(x_i) + \Delta x f'(x_i)+ \frac{\Delta x^2}{2}f''(x_i) + \mathcal{O}(\Delta x)^3 \hspace{5mm} (1)
$$
$$
f(x_i-\Delta x) = f(x_i) - \Delta x f'(x_i)+ \frac{\Delta x^2}{2}f''(x_i) + \mathcal{O}(\Delta x)^3 \hspace{5mm} (2)
$$
Rearrange both equations:
$$
\Delta x f'(x_i)=f(x_i+\Delta x) - f(x_i) - \frac{\Delta x^2}{2}f''(x_i) + \mathcal{O}(\Delta x)^3 \hspace{5mm} (3)
$$
$$
\Delta x f'(x_i) = - f(x_i-\Delta x)+ f(x_i) + \frac{\Delta x^2}{2}f''(x_i) + \mathcal{O}(\Delta x)^3 \hspace{5mm} (4)
$$
Add equations (3) and (4) so that the second-derivative terms cancel:
$$
2\Delta x f'(x_i) = f(x_i+\Delta x) - f(x_i-\Delta x) + \mathcal{O}(\Delta x)^3 \hspace{5mm} (5)
$$
Divide by $2\Delta x$:
$$
f'(x_i) = \frac{f(x_i+\Delta x) - f(x_i-\Delta x)}{2\Delta x } + \mathcal{O}(\Delta x)^2 \hspace{5mm} (6)
$$
```
:::
%% Cell type:markdown id: tags:
## TSE to define second derivatives
There are equations that require second derivatives. The diffusion equation is one of those used in every field of knowledge. The 1-D diffusion equation reads
$$
\frac{\partial f}{\partial t}=v\frac{\partial^2 f}{\partial x^2} \text{ where } v \text{ is the diffusion coefficient.}
$$
For the moment we will use TSE to find **only** a numerical expression of the second derivative $\frac{\partial^2 f}{\partial x^2}$.
The procedure is simple but cumbersome. The general idea is to isolate the second derivative in the TSE without it depending on other derivatives. Below you can find more details about the algebraic manipulation (if you are curious), but you do not need to know it. Here is the result:
$$
f''(x_i)=\frac{f(x_i+2\Delta x)-2f(x_i+\Delta x)+f(x_i)}{\Delta x^2}+ \mathcal{O}(\Delta x).
$$
This is the **forward** difference approximation of the second derivative. Two aspects come to mind: one more point is needed to compute the second derivative and the error (of the simplest second derivative) is also of the order of the step. There are also **backward** and **central** approximations of the second derivative (not shown here).
%% Cell type:markdown id: tags:
**Derivation of the forward difference approximation of the second derivative**
First define a TSE for a point two steps farther from $x_i$:
$$
f(x_i+2\Delta x) = f(x_i) + 2\Delta x f'(x_i)+\frac{(2\Delta x)^2}{2!}f''(x_i)+ \mathcal{O}(\Delta x)^3.
$$
Now multiply by two the TSE for a point one step farther from $x_i$:
$$
2f(x_i+\Delta x) = 2f(x_i) + 2\Delta x f'(x_i)+ 2\frac{\Delta x^2}{2!}f''(x_i) + \mathcal{O}(\Delta x)^3.
$$
By subtracting the second expression from the first, the first derivative disappears:
$$
f(x_i+2\Delta x) - 2f(x_i+\Delta x) = -f(x_i) + \Delta x^2 f''(x_i)+ \mathcal{O}(\Delta x)^3.
$$
By solving for $f''$ we obtain the **forward** expression:
$$
f''(x_i)=\frac{f(x_i+2\Delta x)-2f(x_i+\Delta x)+f(x_i)}{\Delta x^2}+ \mathcal{O}(\Delta x).
$$
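%% Cell type:markdown id: tags:
As an illustration (not part of the original derivation), the forward expression can be checked numerically with $f(x)=\sin(x)$, for which $f''(x)=-\sin(x)$; the standard central expression is included for comparison:
%% Cell type:code id: tags:
``` python
import numpy as np

f = np.sin
x, exact = 1.0, -np.sin(1.0)

for dx in (0.1, 0.05):
    # forward expression derived above, error O(dx)
    fwd = (f(x + 2*dx) - 2*f(x + dx) + f(x)) / dx**2
    # standard central expression, error O(dx^2)
    cen = (f(x + dx) - 2*f(x) + f(x - dx)) / dx**2
    print(f"dx={dx}: forward error {abs(fwd - exact):.1e}, "
          f"central error {abs(cen - exact):.1e}")
```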
%% Cell type:markdown id: tags:
## Higher-accuracy Finite-Difference Approximations
So far we have found expressions with a relatively large error magnitude, $\mathcal{O}(\Delta x)$, with the exception of the central difference approximation. Sometimes a higher accuracy is desired, for which better expressions can be found. The procedure is similar to the algebraic manipulation used to find the **forward** approximation of the second derivative: a series of TSEs is defined at varying distances from $x_i$ and, after algebraic manipulation, a more accurate expression is found. For example, the **forward** approximation of the first derivative:
$$
f'(x_i)=\frac{-f(x_i+2\Delta x)+4f(x_i+\Delta x)-3f(x_i)}{2\Delta x}+ \mathcal{O}(\Delta x^2).
$$
The error magnitude has improved to $\mathcal{O}(\Delta x^2)$ at the expense of using one more point. The accuracy can be even better by using more points. It is important to note that central differences are more accurate than forward and backward differences when using the same number of points.
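%% Cell type:markdown id: tags:
A quick numerical check of this higher-accuracy forward expression (an added sketch, again using $f(x)=\sin(x)$ at $x=1$):
%% Cell type:code id: tags:
``` python
import numpy as np

f = np.sin
x, exact = 1.0, np.cos(1.0)

for dx in (0.1, 0.05):
    d1 = (-f(x + 2*dx) + 4*f(x + dx) - 3*f(x)) / (2*dx)
    print(f"dx={dx}: error {abs(d1 - exact):.1e}")
# Halving dx reduces the error by roughly a factor of four, consistent with O(dx^2).
```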
%% Cell type:markdown id: tags:
:::{card} Exercise
Derive a first derivative $f'(x_i)$ with a 2nd-order error $\mathcal{O}(\Delta x^2)$ finite-difference equation, using the following nodes: $x_i$, $x_i+\Delta x$ and $x_i+2\Delta x$ (or $x_i$, $x_{i+1}$ and $x_{i+2}$).
The finite-difference equations will have the following form:
$$
f'(x_i)= \frac{\alpha f(x_i)+ \beta f(x_i+\Delta x) + \gamma f(x_i+2\Delta x)}{\Delta x} + \mathcal{O}(\Delta x^2)
$$
Use the Taylor series to find $\alpha$, $\beta$ and $\gamma$
```{admonition} Solution
:class: tip, dropdown
Taylor series for $f(x_i+\Delta x)$ and $f(x_i+2\Delta x)$ (Tip: first order derivative and second error order, 1+2=3, meaning a truncation until 3rd order ):
$$
f(x_i+\Delta x) = f(x_i) + \Delta x f'(x_i)+\frac{(\Delta x)^2}{2!}f''(x_i)+ \mathcal{O}(\Delta x)^3.
$$
$$
f(x_i+2\Delta x) = f(x_i) + 2\Delta x f'(x_i)+\frac{(2\Delta x)^2}{2!}f''(x_i)+ \mathcal{O}(\Delta x)^3.
$$
Multiply the TSE of $f(x_i+\Delta x)$ by 4 and expand the term $\frac{(2\Delta x)^2}{2!}f''(x_i)$ in the TSE of $f(x_i+2\Delta x)$:
$$
4f(x_i+\Delta x)= 4f(x_i) + 4\Delta x f'(x_i)+2(\Delta x)^2f''(x_i)+ 4\mathcal{O}(\Delta x)^3.
$$
$$
f(x_i+2\Delta x) = f(x_i) + 2\Delta x f'(x_i)+2(\Delta x)^2f''(x_i)+ \mathcal{O}(\Delta x)^3.
$$
Now take $4f(x_i+\Delta x)-f(x_i+2\Delta x)$:
$$
4f(x_i+\Delta x)-f(x_i+2\Delta x)= 4f(x_i) + 4\Delta x f'(x_i)+2(\Delta x)^2f''(x_i)+ 4\mathcal{O}(\Delta x)^3 - \left[f(x_i) + 2\Delta x f'(x_i)+2(\Delta x)^2f''(x_i)+ \mathcal{O}(\Delta x)^3\right]
$$
$$
= 3f(x_i)+ 2\Delta x f'(x_i) + 3\mathcal{O}(\Delta x)^3
$$
Rearrange for $f'(x_i)$:
$$
- 2\Delta x f'(x_i) = 3f(x_i)-4f(x_i+\Delta x)+f(x_i+2\Delta x)+ 3\mathcal{O}(\Delta x)^3
$$
Divide by $-2 \Delta x$:
$$
f'(x_i)= \frac{-\frac{3}{2}f(x_i)+2f(x_i+\Delta x)-\frac{1}{2} f(x_i+2\Delta x)}{\Delta x} +\mathcal{O}(\Delta x)^2
$$
***The solution is: $\alpha=-\frac{3}{2}, \beta= 2, \gamma= -\frac{1}{2}$***
```
:::
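%% Cell type:markdown id: tags:
The coefficients can also be verified symbolically. The cell below (added for illustration) expands the stencil with `sympy` and shows that the difference with the exact derivative starts at order $\Delta x^2$:
%% Cell type:code id: tags:
``` python
import sympy as sp

x, dx = sp.symbols('x dx')
f = sp.sin(x)  # any smooth test function will do

# Stencil with alpha = -3/2, beta = 2, gamma = -1/2 from the solution above
approx = (sp.Rational(-3, 2)*f
          + 2*f.subs(x, x + dx)
          - sp.Rational(1, 2)*f.subs(x, x + 2*dx)) / dx

# Expand the difference with the exact derivative in powers of dx
print(sp.series(approx - sp.diff(f, x), dx, 0, 4))  # lowest-order term is O(dx**2)
```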
%% Cell type:markdown id: tags:
......
%% Cell type:markdown id: tags:
# Exercises on Taylor expansion
This page shows some exercises on calculating Taylor expansions. If you reload this page, you'll get new values.
Click `rocket` -->`Live Code` to start practising.
%% Cell type:code id: tags:thebe-remove-input-init
``` python
import sympy as sp
import numpy as np
from sympy import pi, latex
from sympy.printing.mathml import mathml
import operator
import ipywidgets as widgets
from IPython.display import display, Latex, display_jpeg, Math, Markdown

sp.init_printing(use_latex=True)

check_equation = lambda eq1, eq2: sp.simplify(eq1 - eq2) == 0

def check_answer(variable_name, expected, comparison=operator.eq):
    output = widgets.Output()
    correct_output = widgets.Output()
    button = widgets.Button(description="Check answer")
    show_correct_button = widgets.Button(description="Show correct answer")

    def _inner_check(button):
        with output:
            if comparison(globals()[variable_name], expected):
                output.outputs = [{'name': 'stdout', 'text': 'Correct!',
                                   'output_type': 'stream'}]
                correct_output.clear_output()  # Clear the correct answer display if they got it right
            else:
                output.outputs = [{'name': 'stdout', 'text': 'Incorrect!',
                                   'output_type': 'stream'}]

    def _show_correct_answer(button):
        with correct_output:
            correct_output.clear_output()  # Clear previous outputs
            latex(Math(display(f"The correct answer is: {expected}")))

    button.on_click(_inner_check)
    show_correct_button.on_click(_show_correct_answer)
    display(button, output, show_correct_button, correct_output)
```
%% Cell type:markdown id: tags:
## Exercise 1
Calculate the Taylor series expansion of:
%% Cell type:code id: tags:thebe-remove-input-init
``` python
x, y = sp.symbols('x, y')
a_1 = sp.Integer(np.random.randint(2,6))
b_1 = sp.Integer(np.random.randint(-10,10))
c_1 = sp.Integer(np.random.randint(-5,5))
eq1_original = a_1 * x**2 + b_1*x
eq1_correct = sp.series(eq1_original,x,c_1)
eq1_answer = 0
display(eq1_original)
#display(eq1_correct)
```
%% Output
$\displaystyle 3 x^{2} - 9 x$
%% Cell type:markdown id: tags:
around:
%% Cell type:code id: tags:thebe-remove-input-init
``` python
display(sp.Eq(x,c_1))
```
%% Output
$\displaystyle x = -3$
%% Cell type:markdown id: tags:
Fill in your answer and run the cell before clicking 'Check answer'.
%% Cell type:code id: tags:auto-execute-page,disable-download-page
``` python
eq1_answer =
```
%% Output
Cell In[5], line 1
eq1_answer =
^
SyntaxError: invalid syntax
%% Cell type:code id: tags:thebe-remove-input-init
``` python
check_answer("eq1_answer",eq1_correct, check_equation)
```
%% Output
%% Cell type:markdown id: tags:
## Exercise 2
Calculate the Taylor series expansion of:
%% Cell type:code id: tags:thebe-remove-input-init
``` python
a_2 = sp.Integer(np.random.randint(1,7))
c_2 = sp.Integer(np.random.randint(-5,5))
eq2_original = a_2*sp.tan(x)
display(eq2_original)
eq2_correct = sp.series(eq2_original,x,c_2*sp.pi,3).removeO()
#display(eq2_correct)
eq2_answer = 0
```
%% Output
$\displaystyle 5 \tan{\left(x \right)}$
%% Cell type:markdown id: tags:
around:
%% Cell type:code id: tags:thebe-remove-input-init
``` python
display(sp.Eq(x,c_2*sp.pi))
```
%% Output
$\displaystyle x = - 2 \pi$
%% Cell type:markdown id: tags:
discard any $O(x^3)$ terms.
Fill in your answer and run the cell before clicking 'Check answer'. Furthermore, use `pi` for $\pi$:
%% Cell type:code id: tags:
``` python
eq2_answer =
```
%% Output
Cell In[9], line 1
eq2_answer =
^
SyntaxError: invalid syntax
%% Cell type:code id: tags:thebe-remove-input-init
``` python
check_answer("eq2_answer",eq2_correct, check_equation)
```
%% Output
%% Cell type:markdown id: tags:
## Exercise 3
Calculate the Taylor series expansion of:
%% Cell type:code id: tags:thebe-remove-input-init
``` python
a_3 = sp.Integer(np.random.randint(1,10))
c_3 = sp.Integer(np.random.randint(-1,1))
eq3_original = a_3 / (1 - x)
display(eq3_original)
eq3_correct = sp.series(eq3_original,x,c_3,3).removeO()
#display(eq3_correct)
eq3_answer = 0
```
%% Output
$\displaystyle \frac{1}{1 - x}$
%% Cell type:markdown id: tags:
around:
%% Cell type:code id: tags:thebe-remove-input-init
``` python
display(sp.Eq(x,c_3))
```
%% Output
$\displaystyle x = -1$
%% Cell type:markdown id: tags:
discard any $O(x^3)$ terms.
Fill in your answer and run the cell before clicking 'Check answer':
%% Cell type:code id: tags:
``` python
eq3_answer =
```
%% Output
Cell In[13], line 1
eq3_answer =
^
SyntaxError: invalid syntax
%% Cell type:code id: tags:thebe-remove-input-init
``` python
check_answer("eq3_answer",eq3_correct, check_equation)
```
%% Output
%% Cell type:code id: tags:
``` python
```
......