Hi, I am trying to use the C# implementation of Levenberg-Marquardt (minlm) and I am a bit confused about how it is supposed to be set up. I have looked at the sample code for the basic vector-of-functions version:
http://www.alglib.net/translator/man/manual.csharp.html#example_minlm_d_v

From what I gather, it is working out what values x[0] and x[1] should take for the following two equations:
y0 = 10 * (x[0]+3)^2
y1 = (x[1]-3)^2
with starting parameters x[0] = 0 and x[1] = 0.
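For reference, this is roughly how I read the example (paraphrased from the manual, so treat it as a sketch; note that in some ALGLIB versions minlmsetcond takes additional epsg/epsf tolerances before epsx):

class Program
{
    // The two functions from the example; as I understand it, LM minimizes f0^2 + f1^2.
    public static void function1_fvec(double[] x, double[] fi, object obj)
    {
        fi[0] = 10 * System.Math.Pow(x[0] + 3, 2);
        fi[1] = System.Math.Pow(x[1] - 3, 2);
    }

    public static void Main()
    {
        double[] x = new double[] { 0, 0 };  // starting point
        double epsx = 0.0000000001;          // stopping tolerance on the step size
        int maxits = 0;                      // 0 = no iteration limit
        alglib.minlmstate state;
        alglib.minlmreport rep;

        alglib.minlmcreatev(2, x, 0.0001, out state); // 2 functions, numerical Jacobian, diff. step 0.0001
        alglib.minlmsetcond(state, epsx, maxits);
        alglib.minlmoptimize(state, function1_fvec, null, null);
        alglib.minlmresults(state, out x, out rep);
        System.Console.WriteLine(alglib.ap.format(x, 2)); // prints something like [-3.00,3.00]
    }
}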
I don't see how the result of x[0] = -3 and x[1] = 3 is being calculated, however. Surely you must supply an observed value of y as input for the algorithm to solve against? Furthermore, shouldn't there be a known independent variable, for instance a time parameter, in the two equations? How would you be able to optimise to find x[0] and x[1] with the function vector:

y0 = 10 * (x[0]+3)^2 * time
y1 = (x[1]-3)^2 * time

using the observed y values at certain times?
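To make my question concrete, here is a sketch of what I imagine the data-fitting version would look like. The arrays t, yObs0 and yObs1 are hypothetical measured data (made-up numbers), and each entry of fi is a residual, model value minus observed value, so the solver would minimize the sum of squared residuals. Is this the intended usage?

// Hypothetical observations: at each time t[i] we measured yObs0[i] and yObs1[i].
static double[] t     = { 1.0, 2.0, 3.0 };
static double[] yObs0 = { 40.0, 80.0, 120.0 };
static double[] yObs1 = { 9.0, 18.0, 27.0 };

// Residual vector: two residuals per observation time, model minus observation.
public static void residuals_fvec(double[] x, double[] fi, object obj)
{
    for (int i = 0; i < t.Length; i++)
    {
        fi[2 * i]     = 10 * System.Math.Pow(x[0] + 3, 2) * t[i] - yObs0[i];
        fi[2 * i + 1] = System.Math.Pow(x[1] - 3, 2) * t[i] - yObs1[i];
    }
}

// Same driver as before, except the function count is now 2 * t.Length:
// alglib.minlmcreatev(2 * t.Length, x, 0.0001, out state);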