Hi,

Do you plan to add tracking of the PC's parameters by PyTorch? For example, a switch that would change the requires_grad of the PC's parameters from False to True. I would like to use the optimizers from torch.optim to train your PCs.
Best regards,
Milan
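(For concreteness, the requested workflow would be something like the sketch below. `pc.parameters()` is an assumed nn.Module-style accessor here, not existing functionality.)

```python
import torch

# Illustration of the requested feature, not existing functionality:
# flip requires_grad on the PC's parameter tensors and hand them to a
# standard torch.optim optimizer, as with any torch.nn.Module.
for p in pc.parameters():
    p.requires_grad_(True)

optimizer = torch.optim.Adam(pc.parameters(), lr=1e-2)
```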
Yes, this is on our TODO list. However, there are a few open problems: for example, the parameters are currently stored in probability space, yet we want to optimize log-probabilities when using gradient-based methods. We need to figure out the best way to do this. We are also very open to suggestions from your side.
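(One common way around the probability-space vs. log-space mismatch, sketched below with illustrative tensor names that are not the library's API, is to keep an unconstrained log-space copy of the parameters, optimize that with torch.optim, and map back to the simplex with a softmax.)

```python
import torch

# Minimal sketch of the log-space reparameterization mentioned above.
# `probs` stands in for sum-node parameters stored in probability space
# (each row summing to 1); the names are illustrative only.
probs = torch.tensor([[0.2, 0.3, 0.5]])

# Optimize unconstrained logits instead of the probabilities themselves...
logits = probs.log().clone().requires_grad_(True)
optimizer = torch.optim.SGD([logits], lr=0.1)

# ...and map back to probability space with a softmax whenever the
# probability-space parameters are needed.
params = torch.softmax(logits, dim=-1)
loss = -params[0, 0].log()   # placeholder objective
loss.backward()
optimizer.step()
```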
Does this mean that the sgd kernel is not going to work well even if I provide the correct supporting code? I need to do gradient descent over the PC's parameters for some work, and I am trying to find the simplest way to do this.
Sorry for the late reply. That is doable with the correct supporting code. Essentially, after performing one forward and one backward pass, pc.param_flows will store the gradients of the output log-likelihoods with respect to the log-parameters specified by pc.params (the same applies analogously to the parameters of input nodes). Could you share more details about your goal? For example, what input distribution do you want to use? I can then help with some supporting code.
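(A training loop built on that description might look like the sketch below, assuming a compiled circuit `pc` and a `data_loader` already exist. The call pattern `pc(x)` / `pc.backward(x)` and the tensor shapes involved are assumptions about the interface, not confirmed API.)

```python
import torch

# Hedged sketch: after one forward and one backward pass, pc.param_flows
# holds d(log-likelihood)/d(log-params), so it can be fed to a torch.optim
# optimizer as the gradient of a log-parameter tensor.
log_params = pc.params.log().detach().clone()
optimizer = torch.optim.SGD([log_params], lr=0.01)

for x in data_loader:
    lls = pc(x)        # forward pass: per-sample log-likelihoods
    pc.backward(x)     # backward pass: populates pc.param_flows

    # We maximize log-likelihood, so the descent gradient is the negated
    # flow. Assigning .grad directly overwrites any stale value.
    log_params.grad = -pc.param_flows
    optimizer.step()

    # Write the updated parameters back into the circuit. A projection or
    # re-normalization step may be needed here to keep sum-node parameters
    # on the simplex; the exact scheme is an assumption.
    with torch.no_grad():
        pc.params.copy_(log_params.exp())
```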