---
license: apache-2.0
---
## attention
Paged attention kernels ported from [vLLM](https://github.com/vllm-project/).
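As background, paged attention stores each sequence's KV cache in fixed-size physical blocks addressed through a block table, rather than in one contiguous buffer. A minimal pure-Python sketch of that indexing scheme (illustrative names and a toy block size, not the actual kernel API):

```python
# Illustrative sketch of paged KV-cache indexing (not the real CUDA kernel API).
BLOCK_SIZE = 4  # tokens per physical block (toy value for illustration)

class PagedKVCache:
    def __init__(self):
        self.blocks = []        # physical blocks: each a list of KV entries
        self.block_table = []   # logical block index -> physical block index

    def append(self, kv):
        """Append one token's KV entry, allocating a new block when the last is full."""
        if not self.blocks or len(self.blocks[self.block_table[-1]]) == BLOCK_SIZE:
            self.block_table.append(len(self.blocks))
            self.blocks.append([])
        self.blocks[self.block_table[-1]].append(kv)

    def get(self, pos):
        """Fetch the KV entry at logical token position `pos` via the block table."""
        physical = self.block_table[pos // BLOCK_SIZE]
        return self.blocks[physical][pos % BLOCK_SIZE]

cache = PagedKVCache()
for t in range(10):
    cache.append(f"kv{t}")
print(cache.get(7))             # -> kv7
print(len(cache.block_table))   # -> 3 (blocks of 4, 4, and 2 tokens)
```

The attention kernels then gather keys and values through this block table, which lets blocks be allocated on demand and shared across sequences.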