Add missing `torch_dtype` in commented snippet in `README.md`
#8 opened by alvarobartt (HF Staff)
Without setting `torch_dtype`, the snippet will fail with `RuntimeError: FlashAttention only support fp16 and bf16 data type`, as it defaults to `torch.float32`.
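A minimal sketch of the corrected pattern, assuming a causal LM loaded with the `transformers` library (the model id below is a placeholder, not the one from this repository's README):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder model id for illustration; substitute the actual model.
model_id = "some-org/some-model"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    # FlashAttention only supports fp16/bf16; without torch_dtype the
    # weights are loaded in torch.float32 and the forward pass raises
    # RuntimeError.
    torch_dtype=torch.bfloat16,
    attn_implementation="flash_attention_2",
    device_map="auto",
)
```

Passing `torch_dtype=torch.float16` would also satisfy FlashAttention's dtype requirement; `bfloat16` is generally preferred on hardware that supports it because of its wider dynamic range.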