PicoNosenso Collection
Where "Accuracy" Takes a Cosmic Vacation
A deliberately unpredictable 7.59M-parameter micro-model trained on minimalist data. It specializes in creatively liberated outputs that blend geography, history, and hallucinatory fiction. It is not designed for factual accuracy; consider it a Dadaist art piece in model form.
License: cc-by-nc-4.0
from transformers import GPT2LMHeadModel, AutoTokenizer

# Load the model and its tokenizer from the Hugging Face Hub
model = GPT2LMHeadModel.from_pretrained('Lominub44/PicoNosenso-v1')
tokenizer = AutoTokenizer.from_pretrained('Lominub44/PicoNosenso-v1')

# Prompt format expected by the model: "<|startoftext|>Question: ...\nAnswer:"
input_text = "<|startoftext|>Question: What is the capital of France?\nAnswer:"
inputs = tokenizer(input_text, return_tensors='pt')

outputs = model.generate(
    **inputs,
    max_length=256,
    temperature=0.4,         # Recommended
    repetition_penalty=1.2,
    do_sample=True,
)
print(tokenizer.decode(outputs[0]))
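The same prompt format can also be run through the transformers text-generation pipeline. This is a minimal sketch under the assumption that the repository works with the default pipeline settings; the model ID, prompt template, and sampling values are taken from the snippet above.

from transformers import pipeline

# Minimal sketch (assumption: the repo is usable with the default
# text-generation pipeline); sampling values mirror the example above.
generator = pipeline('text-generation', model='Lominub44/PicoNosenso-v1')

prompt = "<|startoftext|>Question: What is the capital of France?\nAnswer:"
result = generator(
    prompt,
    max_length=256,
    temperature=0.4,
    repetition_penalty=1.2,
    do_sample=True,
)
print(result[0]['generated_text'])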
BibTeX:
@misc{PicoNosenso,
  author = {Lominub44},
  title = {{PicoNosenso-v1: Where Accuracy Takes a Cosmic Vacation}},
  year = {2025},
  publisher = {Hugging Face},
  howpublished = {\url{https://huggingface.co/Lominub44/PicoNosenso-v1}}
}

@misc{alpaca,
  author = {Rohan Taori and Ishaan Gulrajani and Tianyi Zhang and Yann Dubois and Xuechen Li and Carlos Guestrin and Percy Liang and Tatsunori B. Hashimoto},
  title = {Stanford Alpaca: An Instruction-following LLaMA model},
  year = {2023},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/tatsu-lab/stanford_alpaca}}
}

@misc{no_robots,
  author = {Nazneen Rajani and Lewis Tunstall and Edward Beeching and Nathan Lambert and Alexander M. Rush and Thomas Wolf},
  title = {No Robots},
  year = {2023},
  publisher = {Hugging Face},
  journal = {Hugging Face repository},
  howpublished = {\url{https://huggingface.co/datasets/HuggingFaceH4/no_robots}}
}