H100/H200/B200 FlashAttention3 for Flux + TorchAO improvements #1033

Merged: 17 commits, Oct 8, 2024

Latest commit ee758c4: quantisation should provide error during OOM to use --quantize_via=cpu
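
The commit message points at a quality-of-life change: when quantisation runs out of accelerator memory, the error should tell the user to retry with --quantize_via=cpu. Below is a minimal sketch of that pattern, assuming TorchAO's quantize_ API and a hypothetical quantize_model helper; the PR's actual implementation may differ.

```python
# Hypothetical sketch (not the PR's actual code): wrap TorchAO quantisation so a
# CUDA OOM surfaces an actionable hint about --quantize_via=cpu.
import torch
from torchao.quantization import quantize_, int8_weight_only


def quantize_model(model: torch.nn.Module, quantize_via: str = "accelerator") -> torch.nn.Module:
    """Quantise model weights, pointing at the CPU path if the accelerator runs out of memory."""
    device = "cpu" if quantize_via == "cpu" else "cuda"
    try:
        model.to(device)
        quantize_(model, int8_weight_only())
    except torch.cuda.OutOfMemoryError as err:
        raise RuntimeError(
            "Ran out of accelerator memory while quantising the model. "
            "Retry with --quantize_via=cpu to quantise on the CPU instead."
        ) from err
    return model
```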