
Updated gated models snippet #765
Open · gary149 opened this issue Jun 19, 2024 · 22 comments

gary149 (Collaborator) commented Jun 19, 2024

For gated models, add a comment on how to create the token, and update the code snippet to include the token (edit: as a placeholder)

(screenshot attached)
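A hypothetical shape for the updated snippet, sketched as a string builder so the placeholder handling is explicit. The function name and the generated code are illustrative only, not the actual huggingface.js snippet generator:

```python
# Illustrative sketch of the proposed gated-model snippet with a token
# placeholder (hypothetical helper, not the real huggingface.js code).
def gated_model_snippet(model_id: str) -> str:
    return "\n".join([
        "# Create a read access token at https://huggingface.co/settings/tokens",
        "from transformers import AutoModelForCausalLM",
        "",
        "model = AutoModelForCausalLM.from_pretrained(",
        f'    "{model_id}",',
        '    token="hf_xxx",  # placeholder: replace with your token',
        ")",
    ])
```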
julien-c (Member)

hmm not include the token, but mention running `huggingface-cli login` instead

julien-c (Member)
cc @osanseviero

pcuenca (Member) commented Jun 19, 2024

I agree with mentioning `huggingface-cli login`

julien-c (Member)

and btw the error message if not logged in should already prompt you to run `huggingface-cli login`

mishig25 (Collaborator) commented Jun 19, 2024

> and btw the error message if not logged in should already prompt you to huggingface-cli login

let me know, I can create an issue on transformers

gary149 (Collaborator, Author) commented Jun 19, 2024

btw it should work with diffusers too

julien-c (Member)

i meant it must already be the case AFAIK

julien-c (Member)

and yes it's not library specific

mishig25 (Collaborator) commented Jun 19, 2024

on transformers:

```
/usr/local/lib/python3.10/dist-packages/huggingface_hub/file_download.py:1132: FutureWarning: `resume_download` is deprecated and will be removed in version 1.0.0. Downloads always resume when possible. If you want to force a new download, use `force_download=True`.
  warnings.warn(

OSError: You are trying to access a gated repo.
Make sure to have access to it at https://huggingface.co/meta-llama/Meta-Llama-3-8B.
401 Client Error. (Request ID: Root=1-6672b9e2-4e7561551ca6a7d17bd1e5a8;718527cf-0693-4b8f-ad60-8a60fb2cf76c)

Cannot access gated repo for url https://huggingface.co/meta-llama/Meta-Llama-3-8B/resolve/main/config.json.
Access to model meta-llama/Meta-Llama-3-8B is restricted. You must be authenticated to access it.
```

mishig25 (Collaborator)

on diffusers:

```
HTTPError: 401 Client Error: Unauthorized for url: https://huggingface.co/api/models/stabilityai/stable-diffusion-3-medium

The above exception was the direct cause of the following exception:

GatedRepoError                            Traceback (most recent call last)
GatedRepoError: 401 Client Error. (Request ID: Root=1-6672ba64-6d0027fd0aff7b1b00e1e485;ea7fdd17-c3a6-4c4f-b0a2-e6d37f3b5a6f)

Cannot access gated repo for url https://huggingface.co/api/models/stabilityai/stable-diffusion-3-medium.
Access to model stabilityai/stable-diffusion-3-medium is restricted. You must be authenticated to access it.

The above exception was the direct cause of the following exception:

OSError                                   Traceback (most recent call last)
/usr/local/lib/python3.10/dist-packages/diffusers/pipelines/pipeline_utils.py in download(cls, pretrained_model_name, **kwargs)
   1546             else:
   1547                 # 2. we forced `local_files_only=True` when `model_info` failed
-> 1548                 raise EnvironmentError(
   1549                     f"Cannot load model {pretrained_model_name}: model is not cached locally and an error occurred"
   1550                     " while trying to fetch metadata from the Hub. Please check out the root cause in the stacktrace"

OSError: Cannot load model stabilityai/stable-diffusion-3-medium: model is not cached locally and an error occurred while trying to fetch metadata from the Hub. Please check out the root cause in the stacktrace above.
```

julien-c (Member)

Ok, so maybe we add "Please log in using `huggingface-cli login` or a similar mechanism" to that error message:

> Access to model stabilityai/stable-diffusion-3-medium is restricted. You must be authenticated to access it.

then?

(i think that error string is in moon)
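The suggestion above amounts to appending a login hint to the existing string. A hypothetical sketch (the real error string lives server-side; the function name and exact wording here are illustrative):

```python
# Hypothetical sketch of the improved gated-repo error message, with the
# proposed login hint appended (not the actual Hub server code).
def gated_repo_error(model_id: str) -> str:
    return (
        f"Access to model {model_id} is restricted. "
        "You must be authenticated to access it. "
        "Please log in using `huggingface-cli login` or a similar mechanism."
    )
```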

julien-c (Member)

but orthogonally, i'm ok with adding a `huggingface-cli login` command to the snippets, as discussed in #765 (comment)
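Adding the login command to snippets could be as simple as prepending a comment line for gated models. A minimal sketch, assuming a hypothetical helper (the real snippet logic lives in huggingface.js and is written in TypeScript):

```python
# Hypothetical helper: prepend a login hint to a generated code snippet
# when the target model is gated (illustrative, not huggingface.js code).
def with_login_hint(snippet: str, gated: bool) -> str:
    if not gated:
        return snippet
    return "# Log in first: huggingface-cli login\n" + snippet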

julien-c (Member) commented Jun 19, 2024

oh and btw maybe let's also start adding `brew install huggingface-cli` on the line before, as a good easy way to install it?

(cc @Wauplin too)

EDIT: what's an easy alternative for Windows?

Vaibhavs10 (Member)

`pip install huggingface_hub` or `pip install --upgrade huggingface_hub` would be the only way AFAIK for Windows atm.


mfuntowicz (Member)

For Windows, I can help with adding a winget install path: https://github.com/microsoft/winget-cli

julien-c (Member)

would it be `winget install huggingface-cli`?

mfuntowicz (Member)

Tentatively yes; either this or `HuggingFace.Cli`. I think there are short and long package references.
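The install paths discussed in this thread, collected in one place. Note that the brew and winget entries are as proposed here, so their availability and exact package names are assumptions:

```shell
# Cross-platform, including Windows:
pip install --upgrade huggingface_hub

# macOS/Linux via Homebrew (as proposed in this thread):
brew install huggingface-cli

# Windows via winget (proposed; package name tentative):
winget install huggingface-cli
```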

Wauplin (Contributor) commented Jun 26, 2024

@mfuntowicz I'm not knowledgeable on winget specifically but please let me know if I can be of any assistance packaging huggingface-cli :)

goldingdamien

Is `huggingface-cli login` the recommended way to log in for transformers.js? For example, when an "Error: Forbidden access to file" error is received.
Are there no JS functions provided to log in with an `HF_TOKEN`?

pcuenca (Member) commented Oct 19, 2024

Hello @goldingdamien! For transformers.js, I think you need to follow these steps (server environments): https://huggingface.co/docs/transformers.js/en/guides/private

goldingdamien

@pcuenca Thank you. Yes, that was the first issue. I also had to request access. Going to the link in the error directly made that clear.
Regards.
