Fixed Link issue.
README.md CHANGED
@@ -2,7 +2,7 @@
 
 Hey there, fellow researchers, developers, and AI enthusiasts! Today I'm releasing the newest and biggest Pico-OpenLAiNN model. This LLM was trained on the full 32B tokens that the entire Pico-OpenLAiNN family is trained on.
 
-You can find the GGUF quants of this model [here](https://huggingface.co/UUFO-Aigis/Pico-OpenLAiNN-500M).
+You can find the GGUF quants of this model [here](https://huggingface.co/UUFO-Aigis/Pico-OpenLAiNN-500M-gguf).
 ## Models Overview
 
 - **Pico-OpenLAiNN-100**: The smallest of the bunch, this 100M parameter model is perfect for quick experiments and applications where computational resources are *extremely* limited.