Add tensor parallelism to the paged llama model
This adds one test that checks the sharded variant against the unsharded one, and makes `sharktank.examples.paged_llm_v1` support a tensor-parallelism CLI option. This change adds a lot of sharded variants of PyTorch API-equivalent ops, but some of them lack auto-testing; index_copy_, index_put_, slicing, flatten, unflatten, and reshape have tests. Check that replication and splitting of an unsharded tensor is not an actual copy. It is probably unintuitive that, when run through PyTorch, the sharded result shares the same memory as the original. It may be better to change the semantics and require an actual copy; during export this would insert copies that the compiler would need to optimize out. Add a test for the sharded paged KV cache.
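The memory-sharing point can be reproduced with plain PyTorch (a minimal sketch, not the sharktank sharding API): `torch.chunk`/`torch.split` return views over the original storage, so a sharded result built from them aliases the unsharded tensor rather than copying it.

```python
import torch

# Sketch: splitting a tensor in PyTorch yields views, not copies.
unsharded = torch.arange(16.0).reshape(4, 4)
shards = torch.chunk(unsharded, chunks=2, dim=1)  # split along the column dim

# Each shard aliases the original storage; no data is copied.
for shard in shards:
    assert shard.untyped_storage().data_ptr() == unsharded.untyped_storage().data_ptr()

# Consequently, in-place writes to the unsharded tensor are visible in the shards.
unsharded[0, 0] = 42.0
assert shards[0][0, 0] == 42.0
```

If the semantics were changed to require an actual copy, the equivalent would be cloning each view (e.g. `shard.clone()`), which is the copy the compiler would then need to optimize away during export.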
Showing 27 changed files with 2,408 additions and 151 deletions.