Add missing `torch_dtype` in commented snippet in `README.md`

#8
by alvarobartt HF Staff - opened

Without setting `torch_dtype`, the snippet fails with `RuntimeError: FlashAttention only support fp16 and bf16 data type`, since the model weights default to `torch.float32`.
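A minimal sketch of the kind of fix this PR makes (the actual model ID from the README is not shown here, so `"org/model-name"` is a placeholder):

```python
import torch
from transformers import AutoModelForCausalLM

# Passing torch_dtype explicitly avoids loading in float32,
# which FlashAttention does not support (fp16/bf16 only).
model = AutoModelForCausalLM.from_pretrained(
    "org/model-name",  # placeholder for the model in the README snippet
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
    device_map="auto",
)
```

`torch.bfloat16` is shown here; `torch.float16` would also satisfy the FlashAttention dtype requirement.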

Ready to merge
This branch is ready to be merged automatically.