Hey,

I am about to use SupConLoss for my specific application: I am embedding graphs and would like the embeddings of similar graphs to become more similar, so this loss seems like a good fit. However, I have run into some problems.

The input to the `forward` function should have shape `(bsz, n_views, n_features)`, but I don't have multiple views per embedding (the equivalent of image crops). My model's output has shape `(bsz, d_embedding)`, and I assumed I could adapt it to your loss function with `unsqueeze(1)`, which gives shape `(bsz, 1, d_embedding)`.
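To make the shapes concrete, this is what I am doing (a sketch; the batch size and embedding dimension are just placeholder values):

```python
import torch

bsz, d_embedding = 8, 16
embeddings = torch.randn(bsz, d_embedding)  # model output: (bsz, d_embedding)

# Add a singleton "views" dimension to match the expected
# (bsz, n_views, n_features) layout, with n_views = 1
features = embeddings.unsqueeze(1)          # -> (bsz, 1, d_embedding)
print(features.shape)                       # torch.Size([8, 1, 16])
```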
This input results in `nan`, which I confirmed by tracking the variables inside your loss function.
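For what it's worth, here is a minimal sketch of where I think the `nan` comes from (this is my assumption about the loss internals, not necessarily your exact code): if an anchor's row of the positives mask is all zeros after self-contrast removal, the per-anchor mean over positives becomes a 0/0 division.

```python
import torch

# Assumed failure mode: an anchor with no positives left in its mask row
mask = torch.zeros(1, 3)               # no positives for this anchor
log_prob = torch.randn(1, 3)           # stand-in for per-pair log-probs

# Mean log-prob over positives: 0 / 0 -> nan
mean_log_prob_pos = (mask * log_prob).sum(1) / mask.sum(1)
print(torch.isnan(mean_log_prob_pos))  # tensor([True])
```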
Also, when I pass a `mask`, it contains ones and zeros; `mask[i, j] = 1` means to me that the i-th and j-th embeddings in my batch should be pulled close together.
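For example, this is how I would build such a mask (a hypothetical case where, among 4 graphs, the pairs (0, 1) and (2, 3) are the similar ones):

```python
import torch

bsz = 4
mask = torch.zeros(bsz, bsz)

# Mark similar pairs symmetrically
for i, j in [(0, 1), (2, 3)]:
    mask[i, j] = mask[j, i] = 1.0

# Each embedding is trivially similar to itself
mask.fill_diagonal_(1.0)
```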
When using the `labels` argument instead, I expect embeddings with the same label to be pulled close to each other (the mask then being created automatically).
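To make that assumption concrete, I believe the mask derived from `labels` would be equivalent to this (a sketch of my understanding, not necessarily your exact internal code):

```python
import torch

labels = torch.tensor([0, 0, 1, 2, 2])

# My understanding: entry (i, j) is 1 exactly when labels[i] == labels[j]
mask = torch.eq(labels.view(-1, 1), labels.view(1, -1)).float()
```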
Are these assumptions right?
Appreciate your help!