
SakanaAI/TinySwallow-1.5B (Text Generation, 2B parameters)
A compact Japanese model trained with "TAID: Temporally Adaptive Interpolated Distillation for Efficient Knowledge Transfer in Language Models".