
Using SupContrast loss for non-image application #121

Open
mossishahi opened this issue Oct 12, 2022 · 1 comment

@mossishahi
Hey,

I'd like to use SupConLoss for a non-image application: I am embedding graphs and want the embeddings of similar graphs to become more similar, so this loss seems like a good fit. However, I have run into some problems:

  • The input to the forward function should have shape (bsz, n_views, n_features), but I don't have multiple views per embedding (the equivalent of image crops). My model's output has shape (bsz, d_embedding), so I assumed I could adapt it to your loss function with unsqueeze(1), giving shape (bsz, 1, d_embedding).
  • With that input, the loss evaluates to NaN. I confirmed this by tracking the variables inside your loss function.
  • About the mask argument: it contains ones and zeros, and I read mask[i, j] = 1 as meaning that the i-th and j-th embeddings in my batch should be pulled together.
    • When I pass the labels argument instead, I expect embeddings with the same label to be pulled together (the mask is then created automatically).

Are these assumptions right?

Thanks in advance.
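For reference, here is a minimal NumPy sketch that reproduces the NaN. This is my own reimplementation of the supervised contrastive loss for the n_views = 1 case, not the repo's PyTorch code, so take it as an illustration under that assumption: when an anchor's label is unique in the batch, the "mean log-probability over positives" divides by zero, which is where the NaN comes from.

```python
import numpy as np

def sup_con_loss(features, labels, temperature=0.07):
    """Supervised contrastive loss, n_views = 1 case: features has shape (bsz, d)."""
    features = features / np.linalg.norm(features, axis=1, keepdims=True)
    sim = features @ features.T / temperature      # pairwise similarities
    sim = sim - sim.max(axis=1, keepdims=True)     # numerical stability
    pos_mask = (labels[:, None] == labels[None, :]).astype(float)
    self_mask = 1.0 - np.eye(len(labels))          # exclude self-contrast
    pos_mask = pos_mask * self_mask
    exp_sim = np.exp(sim) * self_mask
    log_prob = sim - np.log(exp_sim.sum(axis=1, keepdims=True))
    # Mean log-probability over positives. An anchor with NO positive in the
    # batch makes pos_mask.sum(axis=1) zero here: 0/0 -> NaN.
    with np.errstate(invalid="ignore", divide="ignore"):
        mean_log_prob_pos = (pos_mask * log_prob).sum(axis=1) / pos_mask.sum(axis=1)
    return -mean_log_prob_pos.mean()

rng = np.random.default_rng(0)
feats = rng.normal(size=(4, 8))
ok  = sup_con_loss(feats, np.array([0, 0, 1, 1]))  # every anchor has a positive
bad = sup_con_loss(feats, np.array([0, 0, 1, 2]))  # labels 1 and 2 are unique
print(np.isnan(ok), np.isnan(bad))
```

So with (bsz, 1, d_embedding) input, the loss only stays finite if every sample in the batch shares its label with at least one other sample; a batch sampler that guarantees this would avoid the NaN.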

@meiling-fdu

Hello, have you solved this problem?
