I am trying to write code that fits a logistic function to a set of data. I am using the Levenberg-Marquardt routine (as given in Numerical Recipes, 3rd edition), which requires me to supply a function that provides both the output value and the partial derivatives with respect to the parameters. Here I ran into something odd:
My logistic function is:

y = A + B / (1 + exp(-k (x - x0)))
I derived and verified the partial derivatives:

dy/dA  = 1
dy/dB  = 1 / (1 + exp(-k (x - x0)))
dy/dk  = -(x - x0) * B * exp(-k (x - x0)) / (1 + exp(-k (x - x0)))^2
dy/dx0 = k * B * exp(-k (x - x0)) / (1 + exp(-k (x - x0)))^2
Then I wrote the following code (yes, it's JavaScript, I apologize):
function logistic(x, params, dyda) {
    let diff = x - params[3]; // (x - x0)
    let ex = Math.exp(-params[2] * diff);
    let fac = 1 / (1 + ex);
    let fac2 = params[1] * ex * fac * fac; // B * exp(-k (x - x0)) / (1 + exp(-k (x - x0)))^2
    dyda[0] = 1;                 // dy/dA
    dyda[1] = fac;               // dy/dB
    dyda[2] = -diff * fac2;      // dy/dk
    dyda[3] = params[2] * fac2;  // dy/dx0
    return params[0] + params[1] * fac;
}
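To double-check the differentiation numerically, one can compare dyda against central differences (a quick sketch; the helper name and the step h = 1e-6 are my own choices, nothing from Numerical Recipes -- logistic() is repeated so the snippet runs on its own):

```javascript
// logistic() as posted above, repeated so this snippet is self-contained.
function logistic(x, params, dyda) {
    let diff = x - params[3];
    let ex = Math.exp(-params[2] * diff);
    let fac = 1 / (1 + ex);
    let fac2 = params[1] * ex * fac * fac;
    dyda[0] = 1;
    dyda[1] = fac;
    dyda[2] = -diff * fac2;
    dyda[3] = params[2] * fac2;
    return params[0] + params[1] * fac;
}

// Compare each analytic derivative dyda[i] against the central difference
// (f(p + h e_i) - f(p - h e_i)) / (2h) in the i-th parameter.
function checkDerivatives(x, params, h = 1e-6) {
    const analytic = [];
    logistic(x, params, analytic);
    const numeric = params.map((p, i) => {
        const plus = params.slice(), minus = params.slice();
        plus[i] = p + h;
        minus[i] = p - h;
        return (logistic(x, plus, []) - logistic(x, minus, [])) / (2 * h);
    });
    return { analytic, numeric };
}
```

Any entry where analytic and numeric disagree by more than finite-difference noise points at a differentiation mistake for that parameter.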
The function gets the params array containing A, B, k and x0, in that order; it returns the value of y and fills the supplied array dyda with the partial derivatives.
The problem is that the LM routine did not converge at all. It felt like it was not moving in the right direction, so I flipped the signs of dyda[2] and dyda[3] (i.e. of the derivatives w.r.t. k and x0), and everything started to work just great.
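Concretely, the version that converges is identical to logistic() above except for the dyda[2] and dyda[3] lines (renamed here only so both variants can coexist):

```javascript
// Same function with the signs of dyda[2] and dyda[3] flipped --
// this is the variant that converges for me.
function logisticFlipped(x, params, dyda) {
    let diff = x - params[3];
    let ex = Math.exp(-params[2] * diff);
    let fac = 1 / (1 + ex);
    let fac2 = params[1] * ex * fac * fac;
    dyda[0] = 1;
    dyda[1] = fac;
    dyda[2] = diff * fac2;        // was: -diff * fac2
    dyda[3] = -params[2] * fac2;  // was: params[2] * fac2
    return params[0] + params[1] * fac;
}
```

Note the returned y value is unchanged; only the reported derivatives differ.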
Did I make a mistake in differentiation? Was the code wrong? What is going on here?
(For testing, I just generated some data with code like

let params = [0, 10, 0.1, 50];
let xx = [], yy = [];
for (let i = 0; i < 100; i++) { xx.push(i); yy.push(logistic(i, params, [])); }

and even when I tried an initial guess very close to params, LM did not converge -- until I flipped the signs, and then it works like a charm.)