Checked other resources
I searched the Codefuse documentation with the integrated search.
I used the GitHub search to find a similar question and didn't find it.
I am sure that this is a bug in Codefuse-Repos rather than my code.
I added a very descriptive title to this issue.
System Info
Windows
Code Version
Latest Release
Description
I already changed VLLM_MODEL_DICT in model_config.py and downloaded chatglm-6b into the llm_models folder. The Docker image has some dependency errors, so I used local services instead. Even though I modified the code as described in fastchat.md, I cannot connect to the local LLM, and nothing is logged to llm_api.log or sdfile_api.log.
Example Code
# This is how I changed VLLM_MODEL_DICT
VLLM_MODEL_DICT = VLLM_MODEL_DICT or {
    'chatglm2-6b': "chatglm-6b",
}
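For comparison, here is a minimal sketch of the same mapping with the value written as an explicit local path. It assumes VLLM_MODEL_DICT maps a served model name to the directory holding the downloaded weights under llm_models; the llm_models root and the chatglm2-6b folder name are assumptions, so check them against the comments in your model_config.py. It mirrors the existing "VLLM_MODEL_DICT or {...}" pattern, so VLLM_MODEL_DICT must already be defined earlier in the file.

import os

# Assumed layout (hypothetical): weights downloaded to ./llm_models/chatglm2-6b
LLM_MODELS_ROOT = "llm_models"

VLLM_MODEL_DICT = VLLM_MODEL_DICT or {
    # key: the model name used when starting the local service
    # value: the local directory holding the downloaded weights
    "chatglm2-6b": os.path.join(LLM_MODELS_ROOT, "chatglm2-6b"),
}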
Error Message and Stack Trace (if applicable)
No response