Parameter tracking by PyTorch #16

Open
mlnpapez opened this issue Aug 12, 2024 · 3 comments

Comments

@mlnpapez

Hi,

Do you plan to add tracking of the PC's parameters by PyTorch? For example, a switch that would change the requires_grad of the PC's parameters from False to True. I would like to use the optimizers from torch.optim to train your PCs.
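Something along these lines, where everything below is hypothetical and only illustrates the request (`set_requires_grad` does not exist, and `pc` stands for an already compiled PC):

```python
import torch

# Hypothetical usage, only to illustrate the feature request:
# `pc` is a compiled PC and `set_requires_grad` is the proposed switch.
pc.set_requires_grad(True)
optimizer = torch.optim.Adam(pc.parameters(), lr=1e-2)

for x in data_loader:
    optimizer.zero_grad()
    nll = -pc(x).mean()   # negative average log-likelihood per batch
    nll.backward()
    optimizer.step()
```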

Best regards,
Milan

@liuanji
Member

liuanji commented Aug 15, 2024

Hi Milan,

Yes, this is on our TODO list. However, there are a few open issues: for example, the parameters are currently stored in probability space, yet we want to optimize log-probabilities when using gradient-based methods. We need to figure out the best way to do this. We are also very open to suggestions from your side.

Best,
Anji
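
As a rough illustration of the log-space idea above (plain PyTorch, independent of the library's internals; all shapes and names below are made up): the learnable tensors could be unconstrained logits that an optimizer from torch.optim updates freely, with a per-sum-node softmax mapping them into probability space whenever probability-space parameters are needed.

```python
import torch

# Illustrative only: 4 sum nodes, each with 3 child edges.
num_sum_nodes, num_children = 4, 3
logits = torch.zeros(num_sum_nodes, num_children, requires_grad=True)
optimizer = torch.optim.Adam([logits], lr=0.1)

def probability_space_params(logits):
    # Softmax per sum node keeps each row a valid distribution over its children.
    return torch.softmax(logits, dim=1)

# One illustrative update step against a dummy objective; a real PC's
# log-likelihood would take the place of `dummy_objective`.
optimizer.zero_grad()
params = probability_space_params(logits)
dummy_objective = -(params * torch.rand(num_sum_nodes, num_children)).sum()
dummy_objective.backward()
optimizer.step()
```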

@mjojic

mjojic commented Oct 2, 2024

Does this mean that the sgd kernel is not going to work well even if I provide the correct supporting code? I need to do gradient descent over the PC's parameters for some work, and I am trying to find the simplest way to do this.

@liuanji
Member

liuanji commented Oct 30, 2024

Sorry for the late reply. That is doable with the correct supporting code. Essentially, after performing one forward and one backward pass, pc.param_flows will store the gradients of the output log-likelihoods with respect to the log-parameters specified by pc.params (the same applies analogously to the parameters of input nodes). Could you share more details about your goal? For example, what input distribution do you want to use? I can then help with some supporting code.
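
A rough sketch of what such supporting code might look like, under the assumptions stated in the comment above: a compiled circuit `pc` stores probability-space parameters in `pc.params`, and one forward plus one backward pass leaves the gradients of the output log-likelihoods with respect to the log-parameters in `pc.param_flows`. The call signatures `pc(x)` and `pc.backward(x)`, the `data_loader`, and the renormalization note are assumptions for illustration, not confirmed API.

```python
import torch

def sgd_epoch(pc, data_loader, lr=0.01):
    """One epoch of log-space gradient ascent on a compiled PC's sum parameters."""
    for x in data_loader:
        pc(x)             # forward pass (call signature is an assumption)
        pc.backward(x)    # backward pass: fills pc.param_flows (assumption)
        with torch.no_grad():
            # Ascent step in log space, then map back to probability space.
            new_log_params = pc.params.log() + lr * pc.param_flows
            pc.params.copy_(new_log_params.exp())
            # Parameters of the edges under each sum node should then be
            # renormalized so they sum to one again (omitted here for brevity).
```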
