Contrastive learning transformer for PL Network #426

Open
rflperry opened this issue Jan 28, 2021 · 2 comments
Labels
feature a feature we want and know how to do ndd Neuro Data Design

Comments

@rflperry
Member

rflperry commented Jan 28, 2021

Background

Currently, the progressive learning network transformer is learned as a byproduct of optimizing a softmax cross-entropy loss for classification accuracy.

Contrastive loss (reference 1, reference 2) learns a transformer explicitly, penalizing samples of different classes that lie close to one another in the embedding space (see also margin loss). This may be better suited to the kNN voters applied downstream, and achieves state-of-the-art accuracy.
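To make the idea concrete, here is a minimal NumPy sketch of a supervised contrastive (SupCon-style) loss, not the official implementation linked below; the function name, `temperature` parameter, and inputs are illustrative assumptions. Each sample is pulled toward same-class samples and pushed from different-class samples via a softmax over pairwise similarities:

```python
import numpy as np

def supcon_loss(z, labels, temperature=0.1):
    """Supervised contrastive loss sketch (after Khosla et al., 2020).

    z: (n, d) array of embeddings; labels: (n,) integer class labels.
    Returns the mean loss over anchors that have at least one positive.
    """
    # L2-normalize embeddings so similarity is cosine similarity
    z = z / np.linalg.norm(z, axis=1, keepdims=True)
    sim = z @ z.T / temperature

    n = len(labels)
    not_self = ~np.eye(n, dtype=bool)

    # Numerically stable log-softmax over all other samples (denominator)
    logits = sim - sim.max(axis=1, keepdims=True)
    exp_logits = np.exp(logits) * not_self  # exclude self-similarity
    log_prob = logits - np.log(exp_logits.sum(axis=1, keepdims=True))

    # Positives: same label, excluding the anchor itself
    pos_mask = (labels[:, None] == labels[None, :]) & not_self
    pos_counts = pos_mask.sum(axis=1)
    valid = pos_counts > 0  # anchors with at least one positive

    # Average -log p over positives per anchor, then over valid anchors
    per_anchor = -(log_prob * pos_mask).sum(axis=1)[valid] / pos_counts[valid]
    return per_anchor.mean()
```

With embeddings where same-class samples cluster together, this loss is small; the same embeddings with mismatched labels yield a larger loss, which is the gradient signal that shapes the transformer.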

See official implementation here.

Proposed feature: implement contrastive loss.

Validate by first comparing classification accuracy, then comparing transfer efficiency, and determine the best form of contrastive loss to use.
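Since the motivation above is that contrastive embeddings may suit the kNN voters, validation would likely score a kNN classifier on the learned representations. A minimal sketch, assuming hypothetical embedding arrays already produced by the transformer (the `knn_predict` helper and `k` value are illustrative, not part of the proglearn API):

```python
import numpy as np

def knn_predict(train_z, train_y, test_z, k=5):
    """Majority-vote k-nearest-neighbor prediction in embedding space.

    train_z: (n, d) training embeddings; train_y: (n,) integer labels;
    test_z: (m, d) query embeddings. Returns (m,) predicted labels.
    """
    # Squared Euclidean distances between every query and training point
    dists = ((test_z[:, None, :] - train_z[None, :, :]) ** 2).sum(axis=-1)
    # Labels of the k closest training points for each query
    neighbor_labels = train_y[np.argsort(dists, axis=1)[:, :k]]
    # Majority vote per query
    return np.array([np.bincount(v).argmax() for v in neighbor_labels])
```

Comparing this kNN accuracy on embeddings from the softmax-trained transformer versus the contrastive-trained one would directly test the hypothesis, before moving on to transfer efficiency.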

Prior experiments

I experimented with this a bit previously. See the attempted contrastive learning implementation here and preliminary transfer efficiency benchmarks for variations on the contrastive loss layers.

@rflperry rflperry added the feature a feature we want and know how to do label Feb 5, 2021
@PSSF23 PSSF23 added the ndd Neuro Data Design label Aug 24, 2021
@Dante-Basile

Hi, I am looking into addressing this issue for NDD 2021-2022.

@waleeattia

Hello, I'm also looking into this issue for NDD 2021.
