
GLM-4V-9B #614

Open

Yxj-study opened this issue Oct 28, 2024 · 1 comment

@Yxj-study

System Info / 系統信息

I'm getting the following error when installing the dependencies for GLM-4V-9B. What could be the reason?

ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
torchaudio 2.2.2 requires torch==2.2.2, but you have torch 2.4.0 which is incompatible.
ai2-olmo 0.3.0 requires torch<2.3,>=2.0, but you have torch 2.4.0 which is incompatible.
cached-path 1.6.3 requires huggingface-hub<0.24.0,>=0.8.1, but you have huggingface-hub 0.26.1 which is incompatible.

These are the dependencies:
torch>=2.4.0
torchvision>=0.19.0
transformers>=4.45.0
huggingface-hub>=0.25.1
sentencepiece>=0.2.0
jinja2>=3.1.4
pydantic>=2.9.2
timm>=1.0.9
tiktoken>=0.7.0
numpy==1.26.4 # Need less than 2.0.0
accelerate>=0.34.0
sentence_transformers>=3.1.1
gradio>=4.44.1 # web demo
openai>=1.51.0 # openai demo
einops>=0.8.0
pillow>=10.4.0
sse-starlette>=2.1.3
bitsandbytes>=0.43.3 # INT4 Loading

vllm>=0.6.2 # for use with the vLLM framework

flash-attn>=2.6.3 # for use with FlashAttention 2

# PEFT, not needed if you don't use a PEFT fine-tuned model

peft>=0.13.0 # Using with finetune model
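
The conflicts reported above come from packages that are already installed in the environment (torchaudio 2.2.2, ai2-olmo 0.3.0, and cached-path 1.6.3), which pin older versions of torch and huggingface-hub; none of them appear in this requirements.txt. A minimal sketch of one way to sidestep the conflicts, assuming a POSIX shell and an arbitrary environment name glm4v-env, is to install into a fresh virtual environment:

# Create an isolated environment so pre-installed packages cannot conflict
python -m venv glm4v-env
source glm4v-env/bin/activate

# Install the GLM-4V-9B dependencies into the clean environment
pip install --upgrade pip
pip install -r requirements.txt

# Report any remaining incompatibilities between installed packages
pip check

In an isolated environment, pip only has to satisfy the constraints in requirements.txt, so the pins carried by torchaudio, ai2-olmo, and cached-path no longer apply.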

Who can help? / 谁可以帮助到您?

No response

Information / 问题信息

  • The official example scripts / 官方的示例脚本
  • My own modified scripts / 我自己修改的脚本和任务

Reproduction / 复现过程

pip install -r requirements.txt
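
Running this command in an environment that already contains torchaudio 2.2.2, ai2-olmo, and cached-path reproduces the conflicts above, because those packages require older versions of torch and huggingface-hub. If a fresh virtual environment is not an option, one possible workaround, assuming those three packages are not needed for anything else, is to remove them first (a sketch, not an officially recommended fix):

# Remove the packages that pin torch==2.2.2 / torch<2.3 and huggingface-hub<0.24.0
pip uninstall -y torchaudio ai2-olmo cached-path

# Re-run the installation and verify the environment is now consistent
pip install -r requirements.txt
pip check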

requirements.txt contains the same dependency list shown above under System Info / 系統信息.

Expected behavior / 期待表现

I hope I can install the dependencies for GLM-4V-9B.

zRzRzRzRzRzRzR self-assigned this Oct 28, 2024
@zRzRzRzRzRzRzR
Member

zRzRzRzRzRzRzR commented Oct 28, 2024

Please install according to the requirements.txt in the current basic demo.
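
A sketch of what that suggestion could look like in practice, assuming the repository is https://github.com/THUDM/GLM-4 and the demo requirements live at basic_demo/requirements.txt (both the URL and the path are assumptions; check the current repository layout):

# Clone the repository and install the basic demo's requirements in a clean environment
git clone https://github.com/THUDM/GLM-4.git
cd GLM-4/basic_demo          # assumed location of the demo requirements.txt
python -m venv .venv
source .venv/bin/activate
pip install -r requirements.txt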
