This is a 110M-parameter Llama 2 architecture model trained on the TinyStories dataset. The weights are converted from karpathy/tinyllamas; see the llama2.c project for more details.
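
A minimal usage sketch, assuming the checkpoint is published in Hugging Face transformers format under the repo id nickypro/tinyllama-110M (generation settings are illustrative only):

```python
# Load the converted checkpoint and sample a short TinyStories-style continuation.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nickypro/tinyllama-110M"  # repo id from this model card
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Once upon a time,"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,   # example values, not tuned
    do_sample=True,
    temperature=0.8,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```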
