Support pr inference in differentiable computation graph #170

Closed · rtjoa wants to merge 48 commits

Conversation


@rtjoa (Contributor) commented on Dec 23, 2023

Summary

We update autodiff to represent the log probabilities of Dist{Bool}s symbolically, so that parameters can be trained against arbitrary loss functions rather than only via MLE.

In fact, we support "arbitrary interleavings": computation that depends on log probabilities can be used as flip parameters, which in turn yield new symbolic log probabilities, and so on.
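
As a rough illustration of such an interleaving (a hypothetical sketch, not code from this PR; Var, sigmoid, and flip are assumed from the surrounding library, and exp over ADNodes is assumed to exist):

```julia
# Hypothetical interleaving sketch; Var, sigmoid, flip, and exp over ADNodes
# are assumptions about the surrounding API.
p = sigmoid(Var("p"))      # trainable parameter, squashed into (0, 1)
x = flip(p) & flip(p)      # a Dist{Bool} parameterized by p
y = flip(exp(LogPr(x)))    # Pr(x) itself becomes a flip parameter...
loss = -LogPr(y)           # ...and y's log-probability is again differentiable
```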

We also refactor to shrink the interface where possible, clean up BDD differentiation, and add autodiff support for matrices and trig functions.

Main API changes

The core construct we add is the struct LogPr(::Dist{Bool}) <: ADNode.

  • To compute an ADNode containing a LogPr, use compute_mixed rather than compute.
  • To perform inference on a Dist containing a flip whose probability is dependent on a LogPr, use pr_mixed rather than pr.
  • train!(::Valuation, loss::ADNode; epochs, learning_rate) updates a valuation (a dict from Vars to values) in place to minimize loss by gradient descent.
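
A minimal end-to-end sketch of how these fit together (hedged: Var, sigmoid, and flip are assumed from the existing API, and the Valuation constructor and the pr_mixed call shape are guesses based on the description above):

```julia
# Hedged usage sketch; names other than LogPr, Valuation, train!, compute_mixed,
# and pr_mixed are assumptions about the surrounding library.
θ = Var("θ")                              # trainable parameter
x = flip(sigmoid(θ)) & flip(sigmoid(θ))   # Dist{Bool} parameterized by θ

var_vals = Valuation(θ => 0.0)            # dict from Vars to values (constructor assumed)
loss = -LogPr(x)                          # minimizing -log Pr(x) maximizes Pr(x)

train!(var_vals, loss; epochs=300, learning_rate=0.1)

compute_mixed(var_vals, loss)   # loss contains a LogPr, so use compute_mixed, not compute
pr_mixed(var_vals)(x)           # inference that tolerates LogPr-dependent flips (call shape assumed)
```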

We also add basic loss functions: mle_loss and kl_divergence.
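
For instance, mle_loss presumably aggregates the negative log-probabilities of a collection of observed Dist{Bool}s; the exact signature below is a guess based on this description:

```julia
# Hedged sketch; the signature of mle_loss is assumed, as are flip and sigmoid.
observations = [flip(sigmoid(θ)) & flip(sigmoid(θ)), flip(sigmoid(θ))]
loss = mle_loss(observations)
train!(var_vals, loss; epochs=200, learning_rate=0.1)
```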

Tests

New tests are added to test/autodiff_pr.

@rtjoa changed the title from "Support log probabilities in differentiable computation graph" to "Support probabilities in differentiable computation graph" on Dec 23, 2023
@rtjoa marked this pull request as ready for review on Dec 24, 2023
@rtjoa changed the title from "Support probabilities in differentiable computation graph" to "Support pr inference in differentiable computation graph" on Dec 27, 2023
@rtjoa closed this on Jul 19, 2024