New hxa07D family of hybrid models, combining improved RWKV recurrent architectures with Transformer-based attention. Designed for efficient long-context …
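The description gives only the high-level idea, so here is a minimal sketch of what an RWKV-Transformer hybrid stack can look like: recurrent token mixers for most layers, standard self-attention interleaved among them. Everything here is an illustrative assumption, not the actual hxa07D code: `RecurrentMixer` is a toy linear recurrence standing in for real RWKV time mixing, and the one-attention-layer-in-four ratio is a guess.

```python
import torch
import torch.nn as nn

class RecurrentMixer(nn.Module):
    """Toy linear-recurrent token mixer (stands in for RWKV time mixing)."""
    def __init__(self, dim):
        super().__init__()
        self.decay = nn.Parameter(torch.zeros(dim))  # learned per-channel decay
        self.key = nn.Linear(dim, dim)
        self.value = nn.Linear(dim, dim)
        self.out = nn.Linear(dim, dim)

    def forward(self, x):                            # x: (batch, seq, dim)
        k, v = self.key(x), self.value(x)
        w = torch.sigmoid(self.decay)                # decay in (0, 1)
        state = torch.zeros_like(x[:, 0])            # O(1) state, no KV cache
        outs = []
        for t in range(x.size(1)):
            state = w * state + k[:, t].sigmoid() * v[:, t]
            outs.append(state)
        return self.out(torch.stack(outs, dim=1))

class HybridBlock(nn.Module):
    def __init__(self, dim, heads, use_attention):
        super().__init__()
        self.norm = nn.LayerNorm(dim)
        self.use_attention = use_attention
        if use_attention:
            # standard self-attention (causal masking omitted for brevity)
            self.mixer = nn.MultiheadAttention(dim, heads, batch_first=True)
        else:
            self.mixer = RecurrentMixer(dim)

    def forward(self, x):
        h = self.norm(x)
        if self.use_attention:
            h, _ = self.mixer(h, h, h, need_weights=False)
        else:
            h = self.mixer(h)
        return x + h                                 # residual connection

# Attention every 4th layer, recurrence elsewhere (ratio is an assumption).
layers = nn.Sequential(*[
    HybridBlock(dim=256, heads=4, use_attention=(i % 4 == 3))
    for i in range(8)
])
x = torch.randn(2, 16, 256)
print(layers(x).shape)  # torch.Size([2, 16, 256])
```

The design intuition behind such hybrids is that the recurrent layers keep per-token cost and memory constant over sequence length, while the occasional attention layers restore precise long-range retrieval.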
OpenMOSE
AI & ML interests
Can love be expressed as a tensor?
Recent Activity
updated a model about 24 hours ago: OpenMOSE/RWKV-24B-A2B-wakaba-2601
updated a collection 2 days ago: hxa07D RWKV-Transformer Hybrid series
Organizations
None yet