
Different number of steps of Levenberg-Marquardt
http://forum.alglib.net/viewtopic.php?f=2&t=4319

Author:  simon [ Sat Apr 11, 2020 1:20 pm ]
Post subject:  Different number of steps of Levenberg-Marquardt

Hello.

I use the Levenberg-Marquardt algorithm with linear inequality constraints for least-squares fitting. The target function has an analytic Jacobian, but the optimizer's behavior seems strange to me: optimization is faster if I use a numerical derivative instead. I describe the problem in detail below and give the technical details at the end.
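
For reference, here is a minimal self-contained sketch of the two setups being compared. The residuals, dimensions, and starting point are placeholders (not my actual model); only the ALGLIB calls mirror my usage:

Code:
#include <cstdio>
#include "optimization.h"

// Placeholder residuals (Rosenbrock-style), standing in for the real model:
// f0 = x0 - 1,  f1 = 10*(x1 - x0^2)
void fvec(const alglib::real_1d_array &x, alglib::real_1d_array &fi, void *ptr)
{
    fi[0] = x[0] - 1.0;
    fi[1] = 10.0 * (x[1] - x[0] * x[0]);
}

// Analytic Jacobian of the residuals above
void fjac(const alglib::real_1d_array &x, alglib::real_1d_array &fi,
          alglib::real_2d_array &jac, void *ptr)
{
    fvec(x, fi, ptr);
    jac[0][0] = 1.0;          jac[0][1] = 0.0;
    jac[1][0] = -20.0 * x[0]; jac[1][1] = 10.0;
}

int main()
{
    alglib::real_1d_array x0 = "[0.5, 0.5]";
    alglib::minlmstate state;
    alglib::minlmreport rep;
    alglib::real_1d_array x;

    // Variant A: analytic Jacobian (VJ mode), 2 residuals
    alglib::minlmcreatevj(2, x0, state);

    // Variant B: numerical differentiation with step 0.001 (V mode)
    // alglib::minlmcreatev(2, x0, 0.001, state);

    alglib::minlmsetcond(state, 0.0001, 500);
    alglib::minlmoptimize(state, fvec, fjac);   // variant B passes only fvec
    alglib::minlmresults(state, x, rep);

    printf("steps = %d, termination type = %d\n",
           (int)rep.iterationscount, (int)rep.terminationtype);
    return 0;
}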

Evaluating the target function together with its analytic Jacobian takes 15 times longer than evaluating the function alone. That should be a good trade: the optimization runs in a 40-dimensional space, so a finite-difference Jacobian costs roughly 40 extra function evaluations per step, and the analytic version should make each step cheaper. But it does not. Depending on the parameters, optimization takes 35% longer with the analytic derivative. For an unknown reason, the optimizer also makes more steps with the analytic derivative. I ran the regression problem a hundred times: the version with the numerical derivative makes 181 steps on average with a standard deviation of 87, and 2 of the 100 runs stop because they hit the 500-step limit. With the analytic Jacobian enabled, the optimizer makes 185 steps on average with a standard deviation of 166, and 16 of the 100 runs stop at 500 steps.

I use the rough stopping criterion alglib::minlmsetcond(state_, 0.0001, 500) because it is faster. With a tighter step norm of 1.0e-12 it gets worse: the numerical derivative makes 206 steps with a standard deviation of 87, and 3 of 100 runs end at step 500; the analytic Jacobian takes 202 steps with a standard deviation of 178, and 21 of 100 runs end at step 500.
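
The two stopping configurations differ only in the epsx argument of minlmsetcond (a sketch; state as in the snippet above, and in practice only one of the two calls is made):

Code:
// Rough criterion: stop when the step norm drops below 1e-4,
// or after 500 iterations
alglib::minlmsetcond(state, 0.0001, 500);

// Tighter criterion used for the second measurement
alglib::minlmsetcond(state, 1.0e-12, 500);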

Why is the number of steps different, and why does the analytic Jacobian need more of them? The runs that stop at 500 iterations do end up at the right minimum, by the way. I also checked the Jacobian with alglib::minlmoptguardgradient(), and it is correct.
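
The check itself looks like this, as I understand the OptGuard API (a sketch; state, fvec, and fjac as in the first snippet):

Code:
// Ask OptGuard to verify the user-supplied Jacobian against numerical
// differentiation with the given test step.
alglib::minlmoptguardgradient(state, 0.001);

alglib::minlmoptimize(state, fvec, fjac);

// Retrieve the verification report after optimization.
alglib::optguardreport ogrep;
alglib::minlmoptguardresults(state, ogrep);
if (ogrep.badgradsuspected)
    printf("suspicious derivative: f[%d] w.r.t. x[%d]\n",
           (int)ogrep.badgradfidx, (int)ogrep.badgradvidx);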

Profiling shows that alglib_impl::minlmiteration takes a lot of CPU time. Since the optimizer makes more steps with the analytic Jacobian, one would expect the target function to take a larger share of the CPU in that case than with the numerical derivative. But the opposite holds: with the numerical derivative, minlmiteration takes 50% of the CPU time, and with the analytic Jacobian it takes 60%. That seems weird too.

I have set box constraints and the linear inequality constraints A*x < 0. Is it normal for minlmiteration to take this long? Inside the target function I assemble and solve 130 linear systems AX = BY; 100 of them are 1x1 and the rest are sparse, the largest being 253x253. I know that the free ALGLIB has no vectorization, but should the optimizer itself take longer than this target function? At the moment one regression solution takes about 1 second, of which minlmiteration takes 0.6 s.
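
For completeness, my understanding of the constraint setup in ALGLIB terms. The coefficients below are placeholders (my real A has more rows), and since minlmsetlc expresses non-strict inequalities, A*x < 0 is actually passed as A*x <= 0:

Code:
const int n = 40;

// Box constraints: 0.001 <= x[i] <= 100 for every coordinate
alglib::real_1d_array bndl, bndu;
bndl.setlength(n);
bndu.setlength(n);
for (int i = 0; i < n; i++) { bndl[i] = 0.001; bndu[i] = 100.0; }
alglib::minlmsetbc(state, bndl, bndu);

// One linear inequality a.x <= 0: each row of c is [a_0 ... a_39 | rhs],
// and ct[i] = -1 marks the row as "<=".
alglib::real_2d_array c;
c.setlength(1, n + 1);
for (int j = 0; j < n; j++)
    c[0][j] = 1.0;   // placeholder coefficients
c[0][n] = 0.0;       // right-hand side
alglib::integer_1d_array ct = "[-1]";
alglib::minlmsetlc(state, c, ct);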


Technical characteristics:
The alglib::minlmcreatevj() method.
Normally an optimization runs in each thread, but nothing changes with a single thread; all measurements were performed in single-threaded mode.
alglib::minlmsetcond(state, 0.0001, 500);
alglib::minlmsetacctype(state, 1); (see the snippet after this list)
When the numerical derivative is used, the differentiation step is 0.001.
Box constraints and the linear inequality A * x < 0.
The optimization space has dimension 40. All coordinates are small and of similar magnitude: 0.001 < x < 100.
gcc: -O2 -march=native
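
The one setting from this list not shown in the earlier sketches is the acceleration mode; as I read the manual, type 1 enables secant-based acceleration:

Code:
// Acceleration mode from the list above; per the ALGLIB manual this
// enables secant-based model updates between Jacobian evaluations.
alglib::minlmsetacctype(state, 1);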

The code is not simple, but if needed, it is here:
Running the optimizer:
https://github.com/SteshinSS/khnum/blob/master/src/solver/solver.cpp#L54

Setting parameters:
https://github.com/SteshinSS/khnum/blob/master/src/solver/solver.cpp#L88

Calling alglib::minlmoptimize():
https://github.com/SteshinSS/khnum/blob/master/src/solver/solver.cpp#L141

Thank you in advance for your answer.
