[libshortfin] Initial implementation of LLM inference server. #350

Triggered via pull request September 24, 2024 22:06
Status: Success
Total duration: 5m 9s
Artifacts
Setup Python ASan: 3s
Build and test libshortfin: 4m 44s

Annotations

1 warning
Build and test libshortfin: Cache save failed.