---
tags:
- kernel
---
# Activation
Activation is a Python package that provides custom CUDA-based activation kernels, primarily targeting AMD GPUs.
- Currently implemented:
  - [PolyNorm](https://arxiv.org/html/2411.03884v1) (see the reference sketch below)
  - [RMSNorm](https://docs.pytorch.org/docs/stable/generated/torch.nn.RMSNorm.html)
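
For orientation, here is a rough pure-PyTorch sketch of the PolyNorm computation described in the paper. The parameter layout (a three-element `weight` and a scalar `bias`) and the per-power RMS normalization are illustrative assumptions, not the kernel's exact API:

```python
import torch


def rms(x: torch.Tensor, eps: float = 1e-6) -> torch.Tensor:
    # Normalize by the root mean square over the last dimension.
    return x * torch.rsqrt(x.pow(2).mean(dim=-1, keepdim=True) + eps)


def poly_norm_reference(x: torch.Tensor, weight: torch.Tensor, bias: torch.Tensor,
                        eps: float = 1e-6) -> torch.Tensor:
    # PolyNorm combines RMS-normalized powers of the input:
    #   weight[0] * norm(x^3) + weight[1] * norm(x^2) + weight[2] * norm(x) + bias
    # (parameter layout is illustrative; see the paper and kernel source for specifics)
    return (weight[0] * rms(x.pow(3), eps)
            + weight[1] * rms(x.pow(2), eps)
            + weight[2] * rms(x, eps)
            + bias)
```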
## Usage
```python
import torch
from kernels import get_kernel

# Fetch the pre-built kernel from the Hugging Face Hub
activation = get_kernel("motif-technologies/activation")

torch.set_default_device("cuda")

poly_norm = activation.layers.PolyNorm(eps=1e-6)
x = torch.randn(10, 10)

print(poly_norm(x))
```
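
RMSNorm is listed above as well. Assuming it is exposed under `activation.layers` in the same way (the constructor arguments here are illustrative, not a confirmed signature), usage continuing from the snippet above would look analogous:

```python
# Assumed API: an RMSNorm layer exposed alongside PolyNorm; arguments are illustrative.
rms_norm = activation.layers.RMSNorm(hidden_size=10, eps=1e-6)
print(rms_norm(x))
```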
## Performance
### PolyNorm
- Test cases are from the Motif LLM
- You can reproduce the results with:
```bash
cd tests
pytest --run-perf --do-plot
```
