---
tags:
- kernel
---
# Activation
Activation is a Python package that provides custom CUDA-based activation kernels, primarily targeting AMD GPUs.
Currently implemented:
- [PolyNorm](https://arxiv.org/html/2411.03884v1)
- [RMSNorm](https://docs.pytorch.org/docs/stable/generated/torch.nn.RMSNorm.html)
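For reference, RMSNorm can be expressed in a few lines of plain PyTorch. The sketch below follows the formula in the linked `torch.nn.RMSNorm` docs and shows what an RMSNorm kernel computes; the function name is illustrative and is not part of this package's API:

```python
import torch

def rms_norm_ref(x: torch.Tensor, weight: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    # Normalize by the root-mean-square over the last dimension,
    # then apply a learned elementwise scale (the `weight` parameter).
    rms = torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + eps)
    return x * rms * weight
```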
## Usage
```python
import torch
from kernels import get_kernel

# Fetch the compiled kernels from the Hugging Face Hub
activation = get_kernel("motif-technologies/activation")

torch.set_default_device("cuda")
poly_norm = activation.layers.PolyNorm(eps=1e-6)
x = torch.randn(10, 10)
print(poly_norm(x))
```
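To make the fused kernel's output concrete, here is a pure-PyTorch sketch of the computation, based on the formulation in the linked paper (arXiv:2411.03884): PolyNorm is a weighted sum of RMS-normalized powers of the input plus a bias. This is a reference sketch under that assumption, not the package's implementation, and the class name is hypothetical:

```python
import torch

def _rms_norm(x: torch.Tensor, eps: float) -> torch.Tensor:
    # RMS normalization over the last dimension (no learned scale)
    return x * torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + eps)

class PolyNormRef(torch.nn.Module):
    """Reference PolyNorm: y = w0*norm(x^3) + w1*norm(x^2) + w2*norm(x) + b."""

    def __init__(self, eps: float = 1e-6):
        super().__init__()
        self.weight = torch.nn.Parameter(torch.ones(3) / 3)
        self.bias = torch.nn.Parameter(torch.zeros(1))
        self.eps = eps

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return (
            self.weight[0] * _rms_norm(x ** 3, self.eps)
            + self.weight[1] * _rms_norm(x ** 2, self.eps)
            + self.weight[2] * _rms_norm(x, self.eps)
            + self.bias
        )
```

A reference like this runs on CPU, which is handy for checking the CUDA kernel's numerics against a known-good baseline.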
## Performance

### PolyNorm

- Test cases are from the Motif LLM
- You can reproduce the results with:

```bash
cd tests
pytest --run-perf --do-plot
```