Hallucination (#2), opened by Farouk09
Hi @jpacifico , thanks for your reply! I served the model using vLLM with dtype="auto" (which uses FP16 precision for FP32/FP16 models and BF16 for BF16 models), and a temperature setting of 0.3.
The system prompt instructed the LLM to act as "un tuteur pour orienter les étudiants à choisir une formation adéquate" ("a tutor to guide students toward choosing a suitable program").
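For reference, here is a minimal sketch of what that setup might look like with vLLM's OpenAI-compatible server and an OpenAI client; the model id, port, and user message are placeholders, not the actual checkpoint or queries from this thread:

```python
# Server side (shell), roughly: vllm serve <model-id> --dtype auto
# Client side, querying the served model with temperature 0.3 and the system prompt above.
from openai import OpenAI

# vLLM's OpenAI-compatible server ignores the API key, so any string works here.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

response = client.chat.completions.create(
    model="<model-id>",  # placeholder for the served checkpoint
    temperature=0.3,
    messages=[
        {"role": "system",
         "content": "un tuteur pour orienter les étudiants à choisir une formation adéquate"},
        {"role": "user",
         "content": "Quelle formation me conseillez-vous après un bac scientifique ?"},  # example query
    ],
)

print(response.choices[0].message.content)
```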