tied embeddings saving with qwen2-vl #1450

Open
Nazzaroth2 opened this issue Dec 19, 2024 · 0 comments

When I'm training qwen2-vl with lm_head and embed_tokens in target_modules for continued pretraining, I get this user warning:

```
UserWarning: Model with tie_word_embeddings=True and the tied_target_modules=['lm_head'] are part of the adapter. This can lead to complications, for example when merging the adapter or converting your model to formats other than safetensors. See for example huggingface/peft#2018
```
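
For reference, the tying itself is easy to confirm. A minimal check of my own (I believe the 2B Instruct checkpoint is the variant that ships with `tie_word_embeddings=True`):

```python
from transformers import Qwen2VLForConditionalGeneration

model = Qwen2VLForConditionalGeneration.from_pretrained("Qwen/Qwen2-VL-2B-Instruct")

print(model.config.tie_word_embeddings)  # True for this checkpoint

# lm_head and embed_tokens share one underlying tensor when tied.
emb = model.get_input_embeddings().weight
head = model.get_output_embeddings().weight
print(emb.data_ptr() == head.data_ptr())  # True: same storage
```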

I read the PEFT issue, but the thread is fairly stale now, and it doesn't seem like a quick fix is on the way any time soon.

Does anyone else have experience with tied-embedding training and know how "bad" this issue actually is? And are there manual fixes to save the embedding adapters correctly (assuming the saving is wrong in the first place and not perfectly fine already)?
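
In case it helps the discussion, this is the kind of manual workaround I'm imagining: untie lm_head from embed_tokens before attaching the adapter, so each module trains and saves its own weights. This is only a sketch under my own assumptions (plain transformers + peft, placeholder hyperparameters), not a verified fix:

```python
import torch
from peft import LoraConfig, get_peft_model
from transformers import Qwen2VLForConditionalGeneration

model = Qwen2VLForConditionalGeneration.from_pretrained(
    "Qwen/Qwen2-VL-2B-Instruct", torch_dtype=torch.bfloat16
)

# Give lm_head its own copy of the weight matrix so it is no longer tied
# to embed_tokens; otherwise both adapters end up wrapping the same tensor.
if model.config.tie_word_embeddings:
    lm_head = model.get_output_embeddings()
    lm_head.weight = torch.nn.Parameter(lm_head.weight.clone())
    model.config.tie_word_embeddings = False

lora_config = LoraConfig(
    r=16,            # placeholder hyperparameters
    lora_alpha=32,
    target_modules=["q_proj", "k_proj", "v_proj", "o_proj",
                    "embed_tokens", "lm_head"],
)
model = get_peft_model(model, lora_config)

# ... train as usual ...
# model.save_pretrained("qwen2-vl-lora")  # adapters now carry distinct weights
```

The obvious tradeoff is a larger checkpoint, since the full lm_head matrix is stored separately after merging. Whether PEFT's saved adapter is already correct without any of this is exactly the part I'm unsure about.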
