Support Bedrock Batch Inference #250
@bharven (issue description):
Add support for Bedrock Batch Inference when using BedrockLLM's batch(), instead of making individual calls with the sync API.
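For context, a minimal sketch of the behavior the issue refers to, assuming the langchain-aws package's BedrockLLM class and placeholder model/region values; batch() currently fans each input out as its own synchronous invocation rather than submitting a Bedrock batch job.

```python
# Minimal sketch of the current behavior this issue wants to improve.
# Assumes langchain-aws is installed and AWS credentials are configured;
# the model_id and region below are placeholders.
from langchain_aws import BedrockLLM

llm = BedrockLLM(
    model_id="anthropic.claude-v2",  # placeholder model ID
    region_name="us-east-1",
)

prompts = [f"Summarize document {i}" for i in range(5)]

# batch() sends one synchronous InvokeModel request per prompt,
# rather than submitting a single Bedrock batch inference job.
results = llm.batch(prompts)
for text in results:
    print(text)
```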
Comments

@bharven commented:
@3coins The expected behavior would be to use the native Bedrock batch inference capability instead of sync API calls where possible. Bedrock batch inference currently requires >= 1000 examples in a job.

AWS documentation: https://docs.aws.amazon.com/bedrock/latest/userguide/batch-inference.html
Boto3 SDK reference: https://boto3.amazonaws.com/v1/documentation/api/latest/reference/services/bedrock/client/create_model_invocation_job.html
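A minimal sketch of what routing through batch inference could look like, using the boto3 create_model_invocation_job call linked above. The bucket name, IAM role ARN, model ID, and per-model request body shape are hypothetical placeholders; the JSONL recordId/modelInput layout follows the batch inference input format described in the AWS documentation.

```python
# Sketch of submitting a Bedrock batch inference job with boto3.
# All S3 URIs, the role ARN, and the model ID are hypothetical placeholders.
import json
import time

import boto3

bedrock = boto3.client("bedrock", region_name="us-east-1")
s3 = boto3.client("s3", region_name="us-east-1")

prompts = [f"Summarize document {i}" for i in range(1000)]  # >= 1000 records required

# Batch inference input is a JSONL file of {"recordId", "modelInput"} records.
lines = [
    json.dumps(
        {
            "recordId": f"rec-{i:06d}",
            "modelInput": {"inputText": prompt},  # body shape depends on the model
        }
    )
    for i, prompt in enumerate(prompts)
]
s3.put_object(
    Bucket="my-batch-bucket",  # hypothetical bucket
    Key="input/records.jsonl",
    Body="\n".join(lines).encode("utf-8"),
)

job = bedrock.create_model_invocation_job(
    jobName="bedrockllm-batch-example",
    roleArn="arn:aws:iam::123456789012:role/BedrockBatchRole",  # hypothetical role
    modelId="amazon.titan-text-express-v1",  # placeholder model ID
    inputDataConfig={
        "s3InputDataConfig": {"s3Uri": "s3://my-batch-bucket/input/records.jsonl"}
    },
    outputDataConfig={
        "s3OutputDataConfig": {"s3Uri": "s3://my-batch-bucket/output/"}
    },
)
job_arn = job["jobArn"]

# Poll until the job finishes, then read results from the output prefix.
while True:
    status = bedrock.get_model_invocation_job(jobIdentifier=job_arn)["status"]
    if status in ("Completed", "Failed", "Stopped"):
        break
    time.sleep(60)
print("Batch job finished with status:", status)
```

Because the service requires at least 1000 records per job, an implementation inside BedrockLLM's batch() would presumably need to fall back to the existing sync path for smaller inputs.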