I am stuck applying scipy.integrate.odeint
to the following very simple ODE:
dy(t)/dt = y(t) + t^2 with y(0) = 0
The solution computed by SciPy looks wrong (most likely I am misusing odeint somewhere); in particular, it does not satisfy the initial condition.
import numpy as np
import scipy.integrate
import matplotlib.pyplot as plt
import math
def f(y, t):
    # right-hand side of dy/dt = y + t^2
    return [t**2 + y[0]]
ts = np.linspace(-3,3,1000)
res = scipy.integrate.odeint(f, [0], ts)
def y(t):
    # closed-form solution reported by WolframAlpha (footnote 1)
    return -t**2 - 2*t + 2*math.exp(t) - 2
fig = plt.figure(1, figsize=(8,8))
ax1 = fig.add_subplot(211)
ax1.plot(ts, res[:,0])
ax1.text(0.5, 0.95, 'SciPy solution', ha='center', va='top', transform=ax1.transAxes)
ax1 = fig.add_subplot(212)
ax1.plot(ts, np.vectorize(y)(ts))
ax1.text(0.5, 0.95, 'WolframAlpha solution', ha='center', va='top', transform=ax1.transAxes)
plt.show()
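To make the mismatch concrete, here is a small check (not part of the script above) comparing the two solutions near t = 0:

i0 = np.argmin(np.abs(ts))  # grid point closest to t = 0
print(ts[i0], res[i0, 0])   # SciPy's value near t = 0 -- nowhere near 0 when I run this
print(y(0))                 # analytic value at t = 0 -- exactly 0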
1: WolframAlpha: "solve dy(t)/dt = t^2 + y(t), y(0) = 0"
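For reference, the closed form can be checked symbolically (this uses sympy, which is not part of the script above):

import sympy as sp
t = sp.symbols('t')
y_exact = -t**2 - 2*t + 2*sp.exp(t) - 2
print(sp.simplify(sp.diff(y_exact, t) - (y_exact + t**2)))  # 0, so dy/dt = y + t^2 holds
print(y_exact.subs(t, 0))                                   # 0, so y(0) = 0 holds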
Where is my mistake?