forum.alglib.net http://forum.alglib.net/

ALGLIB and multithreading http://forum.alglib.net/viewtopic.php?f=2&t=62
Page 1 of 1
Author: menschmaschine [ Fri Sep 17, 2010 11:24 am ]
Post subject: ALGLIB and multithreading
Hello! I read that ALGLIB does not support multithreading. What does this mean? Does it mean that ALGLIB does not use multithreading to speed up calculations? Or does it mean that it is impossible to execute several ALGLIB calculations in parallel threads? I am not a computer scientist and would appreciate it if someone could give a short explanation of this topic. Kind regards, Christian
Author: Sergey.Bochkanov [ Fri Sep 17, 2010 12:44 pm ]
Post subject: Re: ALGLIB and multithreading
The first one. ALGLIB can't use multiple cores, but nothing prevents you from running several threads (manually) and calling ALGLIB functions from them. Each function will work separately from the others when called in parallel, as long as you share no data between them. See the example below.

Code:
thread1: alglib_function1(matrix1)
thread2: alglib_function1(matrix2)

Everything is OK: the first thread works with matrix1, the second one works with matrix2. No data is shared between the threads, so there are no problems.
Author: menschmaschine [ Fri Sep 17, 2010 2:26 pm ]
Post subject: Re: ALGLIB and multithreading
Thank you. That is what I want to do. |
Author: Kir [ Wed Jun 29, 2016 10:51 am ]
Post subject: Re: ALGLIB and multithreading
Hello, friends! I may be repeating the topic question, but I don't know whether NNE (neural network ensembles) existed in ALGLIB when this thread was created. I use the functions alglib::mlpecreatec1(), alglib::mlpetraines(), alglib::mlpeserialize(), alglib::mlpeunserialize(), alglib::mlpeprocess(), and alglib_impl::_mlpensemble_clear() in my code, and I am going to train several different, separate neural network ensembles simultaneously in several manually started threads. I hope the NNE functions can work separately from each other when called in parallel, just like all other ALGLIB functions? Waiting for a reply. Thank you in advance. Kind regards, Kirill P.S. I use ALGLIB Free Edition v.3.8.2 (C++) now and intend to migrate to 3.10.0.
Author: Sergey.Bochkanov [ Thu Jun 30, 2016 10:33 am ]
Post subject: Re: ALGLIB and multithreading
As long as different threads do not share ALGLIB objects, it is completely safe - ALGLIB has no global thread-unsafe variables. |
Author: Kir [ Thu Jun 30, 2016 10:47 am ]
Post subject: Re: ALGLIB and multithreading
Thank you a lot! You have dispelled my doubts. |
Author: Kir [ Sun Jul 03, 2016 2:23 pm ]
Post subject: Re: ALGLIB and multithreading
Dear Sergey! My task is regression analysis. I have several hundred data sets, and I train the same number of neural network ensembles on them (function mlpetraines()). There is the same set of input vectors for each data set, of course. With the function mlpeprocess() I compute the result for each vector. Finally, I take the maximum of all the results. I run the calculation in multiple threads using the CreateThread() function in C++. And depending on the number of threads used, the results were different, although the data sets and the vectors were equal! Only the order of the calculations differed. I assume the reason is the different initial network weights, which the function mlpecreate1() sets to random values.

I made an experiment: in a single thread I call mlpetraines() twice with the same data set (training set) and mlpeprocess() with the same vector, and I get different results:

Code:
alglib::mlpecreate1(nin, nhid, nout, ensemblesize, ensemble_A);
alglib::mlpetraines(ensemble_A, XY_set, npoints, decay, restarts, info, rep);
alglib::mlpeprocess(ensemble_A, x_Vector, y_Out);
Result_1 = y_Out(0);
// Repeat the same code
alglib::mlpecreate1(nin, nhid, nout, ensemblesize, ensemble_A);
alglib::mlpetraines(ensemble_A, XY_set, npoints, decay, restarts, info, rep);
alglib::mlpeprocess(ensemble_A, x_Vector, y_Out);
Result_2 = y_Out(0);

Result_2 is not equal to Result_1. OK, this only confirms my assumption.

In order to make the initial random values of the network weights the same, I save ensemble_A immediately after the first call of mlpecreate1() and then restore it before the second call of mlpetraines():

Code:
alglib::mlpecreate1(nin, nhid, nout, ensemblesize, ensemble_A);
alglib::mlpeserialize(ensemble_A, string_mem);
alglib::mlpetraines(ensemble_A, XY_set, npoints, decay, restarts, info, rep);
alglib::mlpeprocess(ensemble_A, x_Vector, y_Out);
Result_1 = y_Out(0);
alglib::mlpeunserialize(string_mem, ensemble_A);
alglib::mlpetraines(ensemble_A, XY_set, npoints, decay, restarts, info, rep);
alglib::mlpeprocess(ensemble_A, x_Vector, y_Out);
Result_2 = y_Out(0);

Result_2 is not equal to Result_1 again! I do not understand anything. It is as if one computed the sine of the same value twice and got two different results, isn't it? Is it possible to train the ensemble with the same data twice in the same program and get the same trained ensemble as a result? How can one achieve that, and what is this: a bug or a feature?
Author: Sergey.Bochkanov [ Mon Jul 04, 2016 9:00 am ]
Post subject: Re: ALGLIB and multithreading
Hello! Training functions re-initialize the network with random values every time you call them. So training the network twice on the same data will return two different results: it is a feature, not a bug. If you want, you may comment out the call to mlprandomize() in mlptraines() (not mlpetraines()!); it will make training deterministic.
Author: Kir [ Mon Jul 04, 2016 9:43 am ]
Post subject: Re: ALGLIB and multithreading
Hello, Sergey! Everything is clear now. However, it seems to me that commenting out the call of mlprandomize() in mlptraines() is hardly a good idea, because in that case every restart of the training would run without any fresh initialization of the network. Am I right?
Author: Sergey.Bochkanov [ Mon Jul 04, 2016 2:25 pm ]
Post subject: Re: ALGLIB and multithreading
Hmmm... yes, you are right! On the other hand, if you really need deterministic behavior, you may seed the system RNG with srand(12345) prior to calling the training function. However, the effect of this call is long-lasting and may influence other parts of your program.