
Fix implementation of rand #66

Merged
merged 3 commits into from
Oct 18, 2023

Conversation

ablaom
Member

@ablaom ablaom commented Oct 15, 2023

Closes #65.

This PR also changes the behaviour of rand(d, args...) to use default_rng in place of
GLOBAL_RNG, mimicking changes made to Random in Julia 1.7.

Edit:

No, rather: we now address #65 by implementing Random.Sampler(rng, ::UnivariateDistribution) and Random.rand(rng, sampler), instead of directly overloading each variant of rand([rng, ], ::UnivariateDistribution, args...). This has the effect of automatically using default_rng as the default rng, instead of GLOBAL_RNG, because that is how Random overloads Base.rand.
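For anyone unfamiliar with the Random hook being used: a minimal sketch of the Sampler-based approach, using an invented toy type (Coin is hypothetical, not part of this package; UnivariateFinite works analogously):

```julia
using Random

# Toy "distribution" with two outcomes; stands in for a UnivariateDistribution.
struct Coin
    p_heads::Float64
end

# Hook into Random's machinery: tell Random how to build a sampler for Coin.
# SamplerTrivial just wraps the object itself.
Random.Sampler(::Type{<:AbstractRNG}, d::Coin, ::Random.Repetition) =
    Random.SamplerTrivial(d)

# Define a single draw from the sampler; sp[] retrieves the wrapped Coin.
Random.rand(rng::AbstractRNG, sp::Random.SamplerTrivial{Coin}) =
    rand(rng) < sp[].p_heads ? :heads : :tails

# Because Random defines all the rand variants in terms of Sampler,
# every form now works, with default_rng() supplied when no rng is given:
rand(Coin(0.5))                          # single draw, uses default_rng()
rand(Coin(0.5), 3, 5)                    # 3×5 matrix of draws
rand(MersenneTwister(1), Coin(0.5), 10)  # explicit rng
```

Only the two methods above need to be defined; the array, matrix, and rng-less variants come for free from Random.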

@codecov

codecov bot commented Oct 15, 2023

Codecov Report

All modified lines are covered by tests ✅

Comparison is base (66f23a3) 95.85% compared to head (db73962) 96.03%.
Report is 3 commits behind head on dev.

Additional details and impacted files
@@            Coverage Diff             @@
##              dev      #66      +/-   ##
==========================================
+ Coverage   95.85%   96.03%   +0.18%     
==========================================
  Files           8        8              
  Lines         434      429       -5     
==========================================
- Hits          416      412       -4     
+ Misses         18       17       -1     
Files           Coverage Δ
src/methods.jl  91.26% <100.00%> (+0.43%) ⬆️

☔ View full report in Codecov by Sentry.

@ablaom
Member Author

ablaom commented Oct 15, 2023

@rfourquet Can you confirm that the change from GLOBAL_RNG to default_rng() in this PR is appropriate?

@ablaom
Member Author

ablaom commented Oct 17, 2023

@rikhuijzer Do you have time and interest in looking over this?

@rikhuijzer
Member

I would suggest cleaning up the symbol loading a bit with these changes:

diff --git a/test/methods.jl b/test/methods.jl
index 83215a2..30516fb 100644
--- a/test/methods.jl
+++ b/test/methods.jl
@@ -1,16 +1,17 @@
 module TestUnivariateFiniteMethods
 
-using Test
-using CategoricalDistributions
-using CategoricalArrays
+import CategoricalDistributions: classes, ERR_NAN_FOUND
 import Distributions
-using StableRNGs
 import Random
-rng = StableRNG(123)
+
+using CategoricalArrays
+using CategoricalDistributions
+using Random: default_rng
 using ScientificTypes
-import Random.default_rng
+using StableRNGs
+using Test
 
-import CategoricalDistributions: classes, ERR_NAN_FOUND
+rng = StableRNG(123)
 
 v = categorical(collect("asqfasqffqsaaaa"), ordered=true)
 V = categorical(collect("asqfasqffqsaaaa"))
@@ -314,7 +315,7 @@ if VERSION >= v"1.7"
         @test [rand(d) for i in 1:30] == samples
 
         Random.seed!(123)
-        samples = rand(Random.default_rng(), d, 3, 5)
+        samples = rand(default_rng(), d, 3, 5)
         Random.seed!(123)
         @test samples == rand(d, 3, 5)
     end

To apply these changes, you can copy them into a file somefile.patch and use

$ git apply somefile.patch


@rikhuijzer rikhuijzer left a comment


Apart from that, LGTM

@ablaom ablaom merged commit 2cc3d3d into dev Oct 18, 2023
5 checks passed
@ablaom ablaom mentioned this pull request Oct 18, 2023
Successfully merging this pull request may close these issues.

Fix incomplete implementation of rand(::UnivariateFinite, ...)