
Implicit sequence weights and Implicit factorization weights #122

Open

nikisix wants to merge 21 commits into master
Conversation

@nikisix (Contributor) commented Jul 24, 2018

This branches off the sample-weights branch from our currently open PR, in order to leverage the new losses logic.

I combined the fit and fit_weighted functions for implicit factorization, subsumed mask_padding into sample_weights, and added support for sample weights in to_sequence and the implicit sequence model.

If you're happy with these changes, you can just merge this branch into master and close the sample-weights PR without merging it.
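For context, a minimal sketch of how the unified API might be used after this change. The `weights` argument to `Interactions` exists upstream; that `fit()` consumes it is the PR's proposed behavior, so treat the flow below as an assumption:

```python
import numpy as np

from spotlight.interactions import Interactions
from spotlight.factorization.implicit import ImplicitFactorizationModel

user_ids = np.array([0, 0, 1, 1, 2], dtype=np.int32)
item_ids = np.array([1, 2, 2, 3, 1], dtype=np.int32)
weights = np.array([1.0, 0.5, 1.0, 2.0, 1.0], dtype=np.float32)

# Weights travel on the Interactions object rather than as a separate
# fit_weighted() argument.
interactions = Interactions(user_ids, item_ids, weights=weights,
                            num_users=3, num_items=4)

# A single fit() consumes them (per the PR description; the exact
# merged signature may differ).
model = ImplicitFactorizationModel(n_iter=1)
model.fit(interactions)
```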

@maciejkula (Owner) left a comment

I think the PR is almost there: there is only one comment where I think the current implementation is incorrect. The rest are suggestions on how I think it could be even better.

In general, I would love to see some tests added. The PR touches quite a few parts of the codebase, and it would be lovely to have some additional assurance that things work.

Ideas for tests (a sketch of the first one follows the list):

  1. Set all weights to zeros: the model should produce random results.
  2. Set all weights to a very high value: the model should diverge.
  3. Set all weights for a subset of users to zero: the model should produce random results for those users (but reasonable results for other users).
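
A hedged sketch of what the first test might look like. It assumes the weights-aware `fit()` from this PR; `generate_sequential` and `mrr_score` are existing spotlight utilities, and the 0.05 threshold is an illustrative guess:

```python
import numpy as np

from spotlight.datasets.synthetic import generate_sequential
from spotlight.evaluation import mrr_score
from spotlight.factorization.implicit import ImplicitFactorizationModel


def test_zero_weights_produce_random_results():
    interactions = generate_sequential(num_users=100,
                                       num_items=100,
                                       num_interactions=10000,
                                       random_state=np.random.RandomState(42))

    # Zero out every sample weight: no interaction should contribute
    # any gradient signal during fitting.
    interactions.weights = np.zeros_like(interactions.item_ids,
                                         dtype=np.float32)

    model = ImplicitFactorizationModel(n_iter=5)
    model.fit(interactions)  # assumes the weights-aware fit from this PR

    # With no signal, MRR should sit near the random baseline of
    # roughly 1 / num_items.
    assert mrr_score(model, interactions).mean() < 0.05
```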

Resolved review threads: spotlight/sequence/implicit.py (two threads), spotlight/torch_utils.py
```diff
@@ -15,7 +15,44 @@
 from spotlight.torch_utils import assert_no_grad


-def pointwise_loss(positive_predictions, negative_predictions, mask=None):
+def _weighted_loss(loss, sample_weights=None, mask=None):
```
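
A minimal sketch of what a helper with this signature might compute; this body is inferred from the name and arguments, not taken from the PR:

```python
def _weighted_loss(loss, sample_weights=None, mask=None):
    # loss: an elementwise loss tensor; sample_weights and mask (if given)
    # are tensors broadcastable to the same shape.
    if mask is not None:
        loss = loss * mask.float()
    if sample_weights is not None:
        weighted = loss * sample_weights
        # Normalize by the total weight so the scale of the loss is
        # comparable across batches with different weight sums.
        return weighted.sum() / sample_weights.sum()
    return loss.mean()
```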
@maciejkula (Owner) commented:
Seeing that we have sample weights, could we make it so that we do not have the mask argument at all? We would just make sure the weights are all zero where the mask is currently zero. This is already accomplished by the implementation you added to the Interactions class: we just need to make sure the weights are always present (if they are None, we just make them all ones before we construct the sequences).

I think that would be a good change because, at the moment, the mask and weight arguments co-exist uncomfortably where it's not clear how they interact --- which you also point out in the docstring.
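
A minimal sketch of that suggestion; the placement inside the `Interactions` constructor is hypothetical, and the real change could live in `to_sequence` instead:

```python
# Sketch: default weights to ones at construction time, so downstream
# loss functions can rely on weights always being present and the
# separate mask argument can be dropped entirely.
if weights is None:
    weights = np.ones(len(item_ids), dtype=np.float32)
```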

@nikisix (Contributor, Author) replied:
Agreed, just hadn't gotten around to it.

@nikisix (Contributor, Author) commented Aug 29, 2018:
Attempted in most recent commit. Please let me know what you think.

I might need a little pointing in the right direction with your excellent test ideas.

@maciejkula mentioned this pull request Aug 6, 2018
@nikisix (Contributor, Author) commented Aug 30, 2018

Do you have any literature or other references for what's going on with the transition matrix generator in datasets/synthetic? (Asking mainly out of curiosity.)
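
For context: `datasets/synthetic.generate_sequential` draws the rows of an item-to-item transition matrix from a Dirichlet distribution (controlled by its concentration parameter) and samples user histories as a Markov chain. A simplified first-order sketch of that idea; the hypothetical `sample_markov_sequence` below is illustrative only, and the real generator supports higher orders:

```python
import numpy as np


def sample_markov_sequence(num_items, num_steps, concentration, rng):
    # Each row of the transition matrix is a categorical distribution over
    # the next item, drawn from a symmetric Dirichlet. Small concentration
    # values make rows peaky, so sequences become highly predictable.
    transition = rng.dirichlet(np.full(num_items, concentration),
                               size=num_items)
    sequence = [rng.randint(num_items)]
    for _ in range(num_steps - 1):
        sequence.append(rng.choice(num_items, p=transition[sequence[-1]]))
    return np.array(sequence)


rng = np.random.RandomState(42)
print(sample_markov_sequence(num_items=10, num_steps=20,
                             concentration=0.1, rng=rng))
```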

@nikisix (Contributor, Author) commented Sep 6, 2018

Ok, added most of the tests you outlined. Please let me know what you think.

@nikisix (Contributor, Author) commented Sep 24, 2018

Mind taking another look at this when you can @maciejkula?

@nikisix (Contributor, Author) commented Apr 17, 2019

Mind taking another look when you can, please?
