Codette is an advanced AI assistant designed to support users across cognitive, creative, and analytical tasks.
This model merges Falcon-40B and Mistral-7B to deliver high performance in text generation, medical diagnostics, and code reasoning.
Datasets (Raiff1982/coredata, Raiff1982/pineco):

- Raiff1982/coredata: medical and reasoning-focused samples
- Raiff1982/pineco: mixed-domain creative and technical prompts

A sketch for loading these datasets follows the usage example below.

Example usage:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Raiff1982/Codette"

# Load the tokenizer and model from the Hugging Face Hub;
# trust_remote_code=True allows any custom model code shipped
# with the checkpoint to run.
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, trust_remote_code=True)

prompt = "How can AI improve medical diagnostics?"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate up to 200 tokens (prompt included) and decode the result.
output = model.generate(**inputs, max_length=200)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```
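The two datasets above are referenced by their Hub IDs; below is a minimal sketch for pulling and inspecting them with the `datasets` library. The `train` split name and the printed fields are assumptions about how the data is laid out, not confirmed details of these repositories.

```python
# Minimal sketch: load the datasets referenced above from the Hugging Face Hub.
# The "train" split name is an assumption; adjust it to the actual configuration.
from datasets import load_dataset

coredata = load_dataset("Raiff1982/coredata", split="train")
pineco = load_dataset("Raiff1982/pineco", split="train")

# Inspect sizes and column names before using the data for fine-tuning or evaluation.
print(coredata)
print(pineco.column_names)
print(pineco[0])  # first example as a dict of raw fields
```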
Base model: mistralai/Mistral-7B-v0.3
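For quick experimentation, the same checkpoint can also be driven through the high-level `pipeline` API. The sketch below is illustrative: the prompt and generation settings are assumptions, and `trust_remote_code` mirrors the loading flag used in the example above.

```python
# Sketch: text generation via the pipeline API. max_new_tokens caps only the
# newly generated tokens, leaving the prompt length out of the budget.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="Raiff1982/Codette",
    trust_remote_code=True,
)

result = generator(
    "Explain how retrieval can improve code reasoning.",  # illustrative prompt
    max_new_tokens=150,
)
print(result[0]["generated_text"])
```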