Fix quantize-nbits flag #316

Open · wants to merge 1 commit into base: main
Conversation

nosferatu500

Thank you for your interest in contributing to Core ML Stable Diffusion! Please review CONTRIBUTING.md first. If you would like to proceed with making a pull request, please indicate your agreement to the terms outlined in CONTRIBUTING.md by checking the box below. If not, please go ahead and fork this repo and make your updates.

We appreciate your interest in the project!

Do not erase the below when submitting your pull request:
#########

  • I agree to the terms outlined in CONTRIBUTING.md

With transformers 4.29 (current), it is not possible to use the `quantize-nbits` flag.
@nosferatu500 (Author) commented Feb 28, 2024

With transformers 4.29 (current), it is not possible to use the `quantize-nbits` flag (like this). With version 4.34.1, the issue is gone.
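
For anyone else hitting this, a minimal sketch of the workaround, assuming the repo is installed in editable mode. Only `--quantize-nbits` is the flag this PR is about; the model version and output directory below are placeholders, and the other flags are the usual `torch2coreml` conversion options shown for context.

```bash
# Pin transformers to the version that works with the quantization path
pip install transformers==4.34.1

# Illustrative conversion run using the quantize-nbits flag
# (model version and output directory are placeholders)
python -m python_coreml_stable_diffusion.torch2coreml \
  --model-version runwayml/stable-diffusion-v1-5 \
  --convert-unet --convert-text-encoder --convert-vae-decoder \
  --quantize-nbits 6 \
  -o ./coreml-output
```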

@SpiraMira

@nosferatu500 I still get:

ValueError: Input X contains infinity or a value too large for dtype('float64').

I created a new mamba Python 3.8 environment, but `pip install -e .` installs transformers 4.39.3.

What is your environment like?
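
For comparison, a quick way to check which transformers version actually ended up in the environment (plain pip/Python, nothing repo-specific):

```bash
# Show the installed transformers version
pip show transformers

# Or query it from Python directly
python -c "import transformers; print(transformers.__version__)"
```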

@nosferatu500 (Author)
@SpiraMira The version I mentioned (4.34.1) is the only one that doesn't have problems with ml-stable-diffusion.

@SpiraMira

> @SpiraMira The version I mentioned is the only one that doesn't have problems with ml-stable-diffusion.

@nosferatu500 Yes, you're right (I hope this gets merged quickly).

I had to rebuild a mamba Python 3.8 environment for transformers to "stick" to 4.34.1. I think my issue was my Python 3.10 environment defaulting to 4.39.3; downgrading from there to 4.34.1 seemed problematic. Are you running a Python 3.10+ environment?

Thanks.
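
For reference, a sketch of the rebuild I mean; the environment name is arbitrary, and this assumes the repo is already checked out and you are in its root:

```bash
# Fresh, isolated Python 3.8 environment so the pinned transformers version "sticks"
mamba create -n coreml-sd python=3.8
mamba activate coreml-sd

# Editable install of ml-stable-diffusion, then force the known-good transformers version
pip install -e .
pip install transformers==4.34.1
```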

@nosferatu500 (Author)
@SpiraMira I'm on Python 3.8.
