I am trying to fit a step function using scipy.optimize.leastsq. Consider the following example:
import numpy as np
from scipy.optimize import leastsq
def fitfunc(p, x):
    y = np.zeros(x.shape)
    y[x < p[0]] = p[1]
    y[p[0] < x] = p[2]
    return y
errfunc = lambda p, x, y: fitfunc(p, x) - y
x = np.arange(1000)
y = np.random.random(1000)
y[x < 250.] -= 10
p0 = [500.,0.,0.]
p1, success = leastsq(errfunc, p0, args=(x, y))
print(p1)
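As a sanity check on the model itself, fitfunc can be evaluated on a tiny array (a self-contained sketch that repeats the definition above; the sample values are illustrative only):

```python
import numpy as np

def fitfunc(p, x):
    # step model: p[0] is the step location, p[1]/p[2] the levels on each side
    y = np.zeros(x.shape)
    y[x < p[0]] = p[1]
    y[p[0] < x] = p[2]
    return y

# step at 2.5, level -1 before it and 1 after it
print(fitfunc([2.5, -1., 1.], np.arange(5)))  # [-1. -1. -1.  1.  1.]
```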
The model has three free parameters: the location of the step and the levels on the two sides. Strangely, the first free parameter never changes; if you run this, scipy gives
[ 5.00000000e+02 -4.49410173e+00 4.88624449e-01]
whereas the optimum is 250 for the first parameter and -10 for the second.
Does anyone have an idea why this does not work, and how to make it work?
If I run
print(np.sum(errfunc(p1, x, y)**2.))
print(np.sum(errfunc([250., -10., 0.], x, y)**2.))
I find:
12547.1054663
320.679545235
where the first number is the residual sum of squares at the solution leastsq finds, and the second is its value at the actual optimum that it should have found.
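The symptom can be reproduced numerically (a self-contained sketch of the setup above; the random seed is an added assumption, for reproducibility only): evaluating the cost at two nearby step locations that fall between the same integer samples gives exactly the same value.

```python
import numpy as np

def fitfunc(p, x):
    y = np.zeros(x.shape)
    y[x < p[0]] = p[1]
    y[p[0] < x] = p[2]
    return y

errfunc = lambda p, x, y: fitfunc(p, x) - y

np.random.seed(0)  # assumed seed, for reproducibility only
x = np.arange(1000)
y = np.random.random(1000)
y[x < 250.] -= 10

# as long as no sample crosses the threshold, moving the step location
# leaves every residual, and hence the total cost, unchanged
c0 = np.sum(errfunc([500.2, -1., 1.], x, y)**2.)
c1 = np.sum(errfunc([500.8, -1., 1.], x, y)**2.)
print(c0 == c1)  # True: the small move in p[0] did not change the cost
```

This is consistent with the observation that the first parameter never moves from its starting value.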