I want to replace my custom gradient descent algorithm with an optimized algorithm (ALGLIB's L-BFGS). The solution contains a cost function that calculates both the cost and the gradient.
Code:
public void function1_grad(double[] x, ref double func, double[] grad, object obj)
{
    NNCostFunction costFunction = (NNCostFunction)obj;
    ILArray<double> parameters = ILMath.array(x);
    ResultCostFunction result = costFunction.costFunction(parameters);
    func = result.Cost;               // double
    grad = result.Gradient.ToArray(); // double[]
}
One concept I don't understand is why the gradient is not declared as an `out` parameter. When I try to optimize the function, the cost function is called only once (and the initial parameters never change), even though the stopping criterion is set to 50 iterations.
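My understanding (I may be wrong) is that ALGLIB expects the callback to write the gradient values *into* the `grad` array it passes in, which is why `grad` is not an `out` parameter: a C# array parameter is a reference passed by value, so reassigning the parameter inside the method (as `grad = result.Gradient.ToArray();` does) replaces only the local reference and never reaches the caller, while writing into the array's elements does. A minimal standalone sketch of the difference (the class and method names here are made up for illustration):

```csharp
using System;

public class GradDemo
{
    // Reassigning the parameter rebinds only the local reference;
    // the caller's array is left untouched.
    public static void FillByReassignment(double[] grad)
    {
        grad = new double[] { 1.0, 2.0 };
    }

    // Copying into the existing array mutates the object the caller sees.
    public static void FillInPlace(double[] grad)
    {
        double[] computed = { 1.0, 2.0 };
        Array.Copy(computed, grad, computed.Length);
    }

    public static void Main()
    {
        double[] g1 = new double[2];
        FillByReassignment(g1);
        Console.WriteLine(g1[0]); // still 0 — the reassignment was invisible to the caller

        double[] g2 = new double[2];
        FillInPlace(g2);
        Console.WriteLine(g2[0]); // now 1 — the in-place copy is visible
    }
}
```

If that reading is correct, the callback above would need to copy `result.Gradient` element-by-element into `grad` (e.g. with `Array.Copy`) instead of reassigning it.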
Code:
double epsg = 0;
double epsf = 0;
double epsx = 0;
int maxits = 50;

alglib.minlbfgsstate state;
alglib.minlbfgsreport rep;

alglib.minlbfgscreate(x.Length, 3, x, out state);
alglib.minlbfgssetcond(state, epsg, epsf, epsx, maxits);
alglib.minlbfgsoptimize(state, function1_grad, null, costFunction); // the cost function is called only once
alglib.minlbfgsresults(state, out x, out rep);

System.Console.WriteLine(parameters[r(0, 20), 0]);
parameters.a = ILMath.array(x);
System.Console.WriteLine(parameters[r(0, 20), 0]); // the results are identical, while they should differ
My guess is that the gradient returned by the cost function is never used. The question is: how can I modify my code so that the optimizer does use the gradient?
Thanks