Change the llama max_batch_size to be larger than the default eval batch size #2283

Conversation

leslie-fang-intel
Contributor

Fixes the PyTorch issue pytorch/pytorch#106110 for the llama dynamic-batch test case.
The root cause and discussion are described in pytorch/pytorch#106110 (comment).
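
For context, a minimal sketch of why the cap matters, assuming the benchmark's llama follows the reference implementation, where the attention KV caches are preallocated from a `ModelArgs.max_batch_size` field (the class names and default values below are illustrative assumptions, not the merged diff):

```python
# Illustrative sketch only -- names follow the reference llama implementation,
# not necessarily the torchbenchmark change that was merged.
from dataclasses import dataclass

import torch


@dataclass
class ModelArgs:
    dim: int = 4096
    n_heads: int = 32
    max_seq_len: int = 2048
    # Raising this above the default eval batch size lets dynamic-batch
    # tests (which scale the batch up) fit in the preallocated KV cache.
    max_batch_size: int = 32


class Attention(torch.nn.Module):
    def __init__(self, args: ModelArgs):
        super().__init__()
        head_dim = args.dim // args.n_heads
        # The caches are allocated once, sized by max_batch_size; any
        # runtime batch larger than this overflows them, which is the
        # failure mode pytorch/pytorch#106110 hits under dynamic batching.
        self.cache_k = torch.zeros(
            args.max_batch_size, args.max_seq_len, args.n_heads, head_dim
        )
        self.cache_v = torch.zeros_like(self.cache_k)
```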

@leslie-fang-intel
Contributor Author

@ezyang @xuzhao9, could you kindly take a look?

@leslie-fang-intel
Contributor Author

Hi @xuzhao9, could you take a look at this PR?

@facebook-github-bot
Contributor

@xuzhao9 has imported this pull request. If you are a Meta employee, you can view this diff on Phabricator.

@facebook-github-bot
Contributor

@xuzhao9 merged this pull request in a7ae1fb.
