fix weights update issue when training separation model #1791

Open · wants to merge 1 commit into develop
Conversation

clement-pages
Collaborator

BREAKING: remove the finetune_wavlm attribute from the PixIT task and replace it with wavlm_frozen in the ToTaToNet separation model, for consistency with what is done for the segmentation model.
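A minimal sketch of the breaking change described above. The class and parameter names (PixIT, ToTaToNet, finetune_wavlm, wavlm_frozen) follow the PR text, but the bodies here are simplified stand-ins, not the real pyannote.audio code:

```python
# Hypothetical, simplified stand-ins for the real pyannote.audio classes,
# illustrating where the WavLM-freezing flag lives after this PR.

class ToTaToNet:
    def __init__(self, wavlm_frozen: bool = False):
        # The freezing flag now lives on the separation model itself,
        # mirroring how the segmentation model handles it...
        self.wavlm_frozen = wavlm_frozen


class PixIT:
    def __init__(self, model):
        # ...so the task no longer takes a finetune_wavlm argument.
        self.model = model


model = ToTaToNet(wavlm_frozen=True)
task = PixIT(model)
```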

@clement-pages
Collaborator Author

Merging this PR should fix #1789

@hbredin
Member

hbredin commented Nov 20, 2024

@Lebourdais suggested that we set wavlm.requires_grad to False in __init__ instead of using the with torch.no_grad logic. I think it is indeed a better idea, as it gives a better estimate of the number of trainable parameters (i.e. those for which requires_grad is set to True).
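The suggestion above can be sketched as follows. This is a toy model, not the actual ToTaToNet code: the ToyWavLM module and the wavlm_frozen flag stand in for the real WavLM backbone and the parameter introduced by this PR. Freezing in __init__ makes the frozen weights visible to any code that inspects requires_grad, unlike wrapping the forward pass in torch.no_grad:

```python
import torch.nn as nn


class ToyWavLM(nn.Module):
    """Stand-in for the pretrained WavLM backbone."""

    def __init__(self):
        super().__init__()
        self.proj = nn.Linear(16, 16)


class SeparationModel(nn.Module):
    def __init__(self, wavlm_frozen: bool = False):
        super().__init__()
        self.wavlm = ToyWavLM()
        self.head = nn.Linear(16, 4)
        if wavlm_frozen:
            # Freeze in __init__ rather than wrapping the forward pass in
            # torch.no_grad(): the flag is then reflected in requires_grad,
            # so trainable-parameter counts come out right.
            for p in self.wavlm.parameters():
                p.requires_grad_(False)


def num_trainable(model: nn.Module) -> int:
    """Count parameters the optimizer would actually update."""
    return sum(p.numel() for p in model.parameters() if p.requires_grad)


frozen = SeparationModel(wavlm_frozen=True)
finetuned = SeparationModel(wavlm_frozen=False)
```

With the torch.no_grad approach, both variants would report the same trainable-parameter count even though the frozen backbone never receives gradients.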

@clement-pages can you update your PR to make this change and also update the SSeRiouSS model and diarization task accordingly?

@clement-pages
Collaborator Author

> @clement-pages can you update your PR to make this change and also update the SSeRiouSS model and diarization task accordingly?

OK, I will update it.
