System Info / 系統信息
I'm getting the following error when installing the dependencies for GLM-4V-9B. What could be the reason?
ERROR: pip's dependency resolver does not currently take into account all the packages that are installed. This behaviour is the source of the following dependency conflicts.
torchaudio 2.2.2 requires torch==2.2.2, but you have torch 2.4.0 which is incompatible.
ai2-olmo 0.3.0 requires torch<2.3,>=2.0, but you have torch 2.4.0 which is incompatible.
cached-path 1.6.3 requires huggingface-hub<0.24.0,>=0.8.1, but you have huggingface-hub 0.26.1 which is incompatible.
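These messages are emitted after pip has already installed the requested packages: the resolver is warning that three packages previously installed in the environment (torchaudio 2.2.2, ai2-olmo 0.3.0, and cached-path 1.6.3) pin older versions of torch and huggingface-hub than this requirements file pulls in. GLM-4V-9B's own dependencies do get installed, but those three older packages may no longer work. The usual fix is to install into a fresh virtual environment so no pre-existing package can conflict. A minimal sketch, assuming the GLM-4V-9B demos do not need torchaudio, ai2-olmo, or cached-path ("glm4v-env" is just a placeholder name):

python -m venv glm4v-env              # empty environment: nothing pre-installed to conflict with
source glm4v-env/bin/activate         # on Windows: glm4v-env\Scripts\activate
pip install -r requirements.txt       # the resolver now sees only the packages listed below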
These are the dependencies from requirements.txt:
torch>=2.4.0
torchvision>=0.19.0
transformers>=4.45.0
huggingface-hub>=0.25.1
sentencepiece>=0.2.0
jinja2>=3.1.4
pydantic>=2.9.2
timm>=1.0.9
tiktoken>=0.7.0
numpy==1.26.4 # must be < 2.0.0
accelerate>=0.34.0
sentence_transformers>=3.1.1
gradio>=4.44.1 # web demo
openai>=1.51.0 # openai demo
einops>=0.8.0
pillow>=10.4.0
sse-starlette>=2.1.3
bitsandbytes>=0.43.3 # INT4 Loading
vllm>=0.6.2 # for use with the vLLM framework
flash-attn>=2.6.3 # for use with FlashAttention-2
# PEFT model, not needed if you don't use a PEFT fine-tuned model.
peft>=0.13.0 # for use with PEFT fine-tuned models
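If a separate environment is not an option, an alternative sketch is to remove the three conflicting packages before reinstalling, assuming nothing else in the environment still needs them:

pip uninstall -y torchaudio ai2-olmo cached-path   # drop the packages pinning old torch / huggingface-hub
pip install -r requirements.txt                    # no installed package now pins conflicting versions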
Who can help? / 谁可以帮助到您?
No response
Information / 问题信息
The official example scripts / 官方的示例脚本
My own modified scripts / 我自己修改的脚本和任务
Reproduction / 复现过程
pip install -r requirements.txt
requirements.txt contains the dependency list shown above under System Info.
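After installation, pip check (built into pip) verifies that the installed set has no remaining broken requirements, and a quick import confirms the resolved versions:

pip check                              # reports any packages with unmet requirements
python -c "import torch, transformers; print(torch.__version__, transformers.__version__)"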
Expected behavior / 期待表现
I expect the dependencies for GLM-4V-9B to install without these conflicts.