feat(framework): unsloth #4949
Conversation
Could you please also create documentation similar to this section of the Unsloth docs? https://docs.unsloth.ai/tutorials/how-to-finetune-llama-3-and-export-to-ollama#id-13.-exporting-to-ollama

Should we submit this to unsloth?

We should, but let's create a draft to review first.
* feat: unsloth integrations
* chore: update package export

Signed-off-by: Aaron Pham <[email protected]>
Signed-off-by: Frost Ming <[email protected]>
Follow the Colab notebook: https://colab.research.google.com/drive/1Ys44kVvmeZtnICzWz0xgpRnrIOjZAuxp?usp=sharing#scrollTo=QmUBVEnvCDJv

One can then save the fine-tuned model with `bentoml.unsloth.build_bento`. If the model is continued from a fine-tuned checkpoint, `model_name` must be passed explicitly.
Supports Llama 3, Gemma, Gemma 2, Mistral, and most chat templates that are supported for fine-tuning. Template handling is based on the BentoVLLM and OpenLLM logic.
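The workflow above can be sketched as follows. This is a minimal illustration, not the final API: the `FastLanguageModel.from_pretrained` parameters mirror the linked Colab, the chosen `model_name` value is an example, and the exact `build_bento` signature should be checked against the merged code.

```python
# Sketch of the Unsloth -> BentoML flow described in this PR.
# Assumptions: unsloth and bentoml (with this PR's integration) are installed,
# and a CUDA-capable GPU is available for fine-tuning.
import bentoml
from unsloth import FastLanguageModel

# Load a 4-bit base model for fine-tuning (settings follow the Colab notebook).
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name="unsloth/llama-3-8b-bnb-4bit",
    max_seq_length=2048,
    load_in_4bit=True,
)

# ... run LoRA fine-tuning here, as shown in the Colab notebook ...

# Package the fine-tuned model and tokenizer as a Bento.
bentoml.unsloth.build_bento(model, tokenizer)

# When continuing from a fine-tuned checkpoint, the base architecture cannot
# be inferred from the checkpoint alone, so pass model_name explicitly:
bentoml.unsloth.build_bento(
    model,
    tokenizer,
    model_name="unsloth/llama-3-8b-bnb-4bit",
)
```

The resulting Bento can then be served or containerized with the usual `bentoml serve` / `bentoml containerize` commands.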