vlm: increase the default max_num_batched_tokens for multimodal models #1458

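Below is a minimal sketch of the setting this PR's new default governs, assuming a vLLM-style engine where the scheduler's `max_num_batched_tokens` can be overridden explicitly at engine construction time. The model name and the value shown are illustrative only and are not taken from this PR; multimodal prompts expand into many image placeholder tokens, so a larger per-step token budget helps keep a whole image prompt in one scheduling step.

```python
# Sketch only: explicit override of the scheduler token budget that this
# PR raises by default for multimodal models (values are illustrative).
from vllm import LLM

llm = LLM(
    model="llava-hf/llava-1.5-7b-hf",   # example multimodal model
    max_num_batched_tokens=8192,        # larger budget to fit image tokens
)

# Text-only smoke test; real multimodal requests would also pass image data.
outputs = llm.generate("Describe a typical street scene.")
print(outputs[0].outputs[0].text)
```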