Use Mersenne Twister from C++ standard library #1559
Conversation
A 6x speed improvement is definitely good! I guess reproducibility is probably important for some people, so yeah, I would agree with your solution.
Yeah, while writing this I convinced myself that this would be good to have ;-) I'll try to refactor the code a bit so that the RNG is more encapsulated than it is now – this should also make it easier to switch to a different RNG in the future.
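As a purely hypothetical illustration of such an encapsulation (the names below are invented and not what the refactoring actually introduces):

```cpp
// Hypothetical sketch only: one way to encapsulate the RNG so that the rest
// of the generated code does not depend on the concrete engine. Names are
// invented for illustration and are not part of the actual refactoring.
#include <cstdint>
#include <random>

class RandomGenerator {
public:
    explicit RandomGenerator(std::uint32_t seed_value) : engine(seed_value) {}

    // Re-seed the underlying engine (e.g. from Brian's seed() mechanism)
    void set_seed(std::uint32_t seed_value) { engine.seed(seed_value); }

    // Raw 32-bit draw; rand()/randn() helpers would be built on top of this
    std::uint32_t next_raw() { return engine(); }

private:
    std::mt19937 engine;  // swapping in another engine only touches this line
};
```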
This ensures random number sequences that are reproducible with previous versions
Replace tabs by spaces
This now looks good from my side, and it reproduces the same random numbers as before. When we generate

```python
G = NeuronGroup(10000, """x : 1
                          y : 1""")
G.run_regularly('x = rand()', name="rand")
G.run_regularly('y = randn()', name="randn")
P = PoissonGroup(10000, rates=10*Hz, name='poisson')
```

it still goes from the previous profiling output to the new one (before/after timing screenshots).
Looks great to me!
When I opened #1483, I did not realize that we have a simple solution – C++11 (not that recent!) comes with PRNGs, and it has a Mersenne Twister implementation that is about 6 times as fast as our "randomkit" implementation. So we can delete a lot of code and get faster execution for free 👍
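For illustration, a minimal sketch of the C++11 `<random>` facilities referred to here, in isolation – this is just the standard-library API, not Brian 2's generated code:

```cpp
// Minimal sketch of the C++11 <random> facilities: a Mersenne Twister engine
// plus the standard uniform and normal distributions built on top of it.
#include <iostream>
#include <random>

int main() {
    std::mt19937 rng(42);  // Mersenne Twister engine with an explicit seed

    std::uniform_real_distribution<double> uniform(0.0, 1.0);  // like rand()
    std::normal_distribution<double> gauss(0.0, 1.0);          // like randn()

    for (int i = 0; i < 5; ++i)
        std::cout << uniform(rng) << " " << gauss(rng) << "\n";
    return 0;
}
```

The engine itself (`std::mt19937`) is fully specified by the standard, but the two distribution classes are not, which is exactly the reproducibility caveat discussed next.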
Now, there is one thing that I am not quite sure about: given that we keep using the same algorithm, the quality of the random numbers and the way the `seed` function, multi-threading, etc. work stay the same, but you can no longer reproduce the same random numbers simply by setting the same seed, since the implementations for generating the uniform/Gaussian distributions from the raw random numbers differ (and even if they happened to be the same, there are no guarantees about how they are implemented). As an alternative, we can keep our old `rk_double` and `rk_gauss` functions around and just use the new MT generator, which keeps almost all of the performance improvement and allows us to exactly reproduce previous numbers. The cost is that the code gets a bit messier, since we still need to ship our own functions – and getting things to work correctly with threads and seeds is slightly more complicated. But maybe worth it? Being able to guarantee that we reproduce the same random numbers as before would be quite nice. I think one could say we imply this feature in our docs: https://brian2.readthedocs.io/en/stable/introduction/compatibility.html#random-numbers
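To make the alternative concrete, here is a rough sketch of drawing the raw 32-bit numbers from `std::mt19937` while keeping randomkit-style transformations on top. The formulas below follow the commonly used randomkit/NumPy implementation of `rk_double`/`rk_gauss`; the actual Brian 2 functions may differ in details, so treat this as an assumption-laden illustration rather than the PR's code:

```cpp
// Sketch of the alternative: draw raw 32-bit numbers from std::mt19937 but
// keep randomkit-style transformations on top. The formulas follow the
// commonly used randomkit/NumPy implementation; Brian 2's actual rk_double/
// rk_gauss may differ in details (e.g. caching of the second Gaussian value).
#include <cmath>
#include <random>

static std::mt19937 rng(42);  // raw 32-bit source, seeded as before

// Uniform double in [0, 1) from 53 random bits (randomkit-style)
double rk_double_like() {
    const double a = rng() >> 5;   // upper 27 bits
    const double b = rng() >> 6;   // upper 26 bits
    return (a * 67108864.0 + b) / 9007199254740992.0;  // (a*2^26 + b) / 2^53
}

// Standard-normal value via the Marsaglia polar method (randomkit-style)
double rk_gauss_like() {
    double x1, x2, r2;
    do {
        x1 = 2.0 * rk_double_like() - 1.0;
        x2 = 2.0 * rk_double_like() - 1.0;
        r2 = x1 * x1 + x2 * x2;
    } while (r2 >= 1.0 || r2 == 0.0);
    const double f = std::sqrt(-2.0 * std::log(r2) / r2);
    return f * x1;  // randomkit also keeps f * x2 around for the next call
}
```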
Closes #1483