feat: add image for rayservice #7

Draft · wants to merge 1 commit into main
Conversation

@bincherry commented Sep 6, 2024

@bincherry force-pushed the feat_ray branch 4 times, most recently from 6e7c054 to ac10bd5 on September 11, 2024
@bincherry force-pushed the feat_ray branch 7 times, most recently from 57a568b to 240f4a3 on October 15, 2024
@bincherry changed the title from "feat: add image for ray" to "feat: add image for rayservice" on Oct 15, 2024
@bincherry (Author)

The ray workspace-related parts have been moved into a separate PR: #16

@bincherry (Author)

Currently two example images are included:

  • serve_config_examples

Example image: ghcr.io/bincherry/ray:text_ml

  • vLLM

Example images:

CPU version: ghcr.io/bincherry/ray:2.35.0-py310-cpu-vllm-qwen_qwen2_0_5b_instruct
GPU version (not yet verified): ghcr.io/bincherry/ray:2.35.0-py310-cu121-vllm-qwen_qwen2_0_5b_instruct

See the README in this PR for detailed usage; a rough smoke-test sketch follows below.
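
As a quick smoke test of the vLLM image (not from this PR: the port-forwarded address, the OpenAI-style route, and the served model name are assumptions based on the upstream KubeRay vLLM sample, so adjust them to whatever the README specifies), a running RayService could be queried roughly like this:

```python
# Hypothetical smoke test once the RayService built from the vLLM image is running.
# Assumes the Serve app is reachable locally, e.g. via
# `kubectl port-forward svc/<rayservice>-serve-svc 8000`, and that it exposes the
# OpenAI-compatible chat route used by the upstream KubeRay vLLM sample.
import requests

resp = requests.post(
    "http://localhost:8000/v1/chat/completions",
    json={
        # Served model name is an assumption; use whatever the serve app registers.
        "model": "Qwen/Qwen2-0.5B-Instruct",
        "messages": [{"role": "user", "content": "Say hello in one sentence."}],
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```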


Review thread on the Dockerfile:

USER ray

# Fetch the upstream KubeRay v1.2.1 vLLM sample serve.py into the image for the ray user.
ADD --chown=ray:users https://raw.githubusercontent.com/ray-project/kuberay/refs/tags/v1.2.1/ray-operator/config/samples/vllm/serve.py /home/ray/serve.py

I think this file still needs to be rewritten. The upstream version assumes the download mode: vLLM is pip-installed at runtime and the model is pulled from Hugging Face, so it only needs to expose the model_id parameter, and callers simply pass that model_id when querying. If we point model_id at a local path instead, callers would also have to use that path when querying, which is awkward; we could expose the served-model-name parameter as well, as sketched below.
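
A minimal, hypothetical sketch of what exposing served-model-name could look like in a rewritten serve.py (the MODEL_ID and SERVED_MODEL_NAME environment variable names are assumptions, and it presumes a vLLM version whose engine args accept served_model_name):

```python
# Hypothetical sketch: decouple the weight location from the name API clients use.
import os

from vllm.engine.arg_utils import AsyncEngineArgs
from vllm.engine.async_llm_engine import AsyncLLMEngine

# MODEL_ID can be a local path baked into the image instead of a Hugging Face id.
model_path = os.environ["MODEL_ID"]
# SERVED_MODEL_NAME is what clients pass as "model"; fall back to the path.
served_name = os.environ.get("SERVED_MODEL_NAME", model_path)

engine_args = AsyncEngineArgs(
    model=model_path,
    # Assumes engine args accept served_model_name; otherwise hand the name
    # to the OpenAI-serving layer instead.
    served_model_name=[served_name],
)
engine = AsyncLLMEngine.from_engine_args(engine_args)
```

This keeps the weights local to the image while clients keep addressing the model by a stable public name.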
