
Bag prior weights #76

Open
amueller opened this issue Dec 28, 2023 · 2 comments

@amueller (Contributor)

Hi!
I wanted to check in about the weights in the bag prior. It seems they are 0.961 for the MLP and 0.038 for the GP. Does that seem right? That's what I got after MCMC'ing out the double sampling.

@SamuelGabriel (Contributor)

That could very well be. The GP prior is not very helpful for most datasets, and the MLP prior does include the SCM setting, if I remember correctly, so that would check out. What do you mean by MCMC'ing it out, though? Shouldn't it just be the softmax over this line?

prior_bag_hyperparameters = {'prior_bag_get_batch': (get_batch_gp, get_batch_mlp)

i.e. softmax([2, 1])[0] ≈ 0.73?
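
A quick numerical check of that value (a one-off sketch; assuming PyTorch, which the repo already uses):

import torch

# Softmax over the fixed exponent weights (2, 1): the prior with weight 2
# would get about 73% of the probability mass.
print(torch.softmax(torch.tensor([2.0, 1.0]), dim=0))  # tensor([0.7311, 0.2689])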

@amueller (Contributor, Author) commented Feb 22, 2024

Hm, I don't think that line is used, because there's this:

prior_bag_priors_p = [1.0] + [hyperparameters[f'prior_bag_exp_weights_{i}'] for i in range(1, len(prior_bag_priors_get_batch))]

which overwrites the parameter in the chaining of get_batch. So you first sample a number x between 2 and 10, and then you sample from softmax([1, x]). That is what I did the sampling over (actually MC'ing, not MCMC'ing, sorry).
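
For concreteness, here is a minimal Monte Carlo sketch of that double sampling (the uniform distribution for x over [2, 10] is an assumption; the GP-then-MLP ordering follows the tuple in the quoted hyperparameters):

import numpy as np

rng = np.random.default_rng(0)

# Double sampling described above: first draw the MLP exponent weight x
# (assumed uniform on [2, 10]), then use softmax([1, x]) as the
# prior-bag probabilities (GP first, MLP second).
x = rng.uniform(2.0, 10.0, size=1_000_000)
p_mlp = 1.0 / (1.0 + np.exp(1.0 - x))  # softmax([1, x])[1] in closed form

print(p_mlp.mean(), 1.0 - p_mlp.mean())  # ~0.961 (MLP), ~0.039 (GP)

The average lands around 0.961 / 0.039, consistent with the split reported in the opening comment.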
