vlm: increase the default max_num_batched_tokens for multimodal models #1458
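The PR title refers to the `max_num_batched_tokens` scheduler setting, which caps how many tokens the engine schedules per batch; multimodal models need a larger budget because image or video inputs expand into many placeholder tokens. A minimal sketch of the idea, assuming hypothetical default values and a hypothetical helper name (the PR's actual numbers and code are not shown here):

```python
# Hypothetical sketch, not the PR's actual implementation: pick a larger
# default max_num_batched_tokens when the served model is multimodal.
TEXT_ONLY_DEFAULT = 2048    # assumed text-only default token budget
MULTIMODAL_DEFAULT = 8192   # assumed raised default for multimodal models

def default_max_num_batched_tokens(is_multimodal: bool) -> int:
    """Return the scheduler's default per-batch token budget."""
    return MULTIMODAL_DEFAULT if is_multimodal else TEXT_ONLY_DEFAULT

print(default_max_num_batched_tokens(True))   # multimodal model
print(default_max_num_batched_tokens(False))  # text-only model
```

Users who need a different budget can still override the default explicitly at engine construction time.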

Annotations: 1 warning

ruff (3.11): succeeded Dec 24, 2024 in 8s