Ignis-7B-DPO

Creator: NeuralNovel
Community Organization: ConvexAI
Discord: Join us on Discord
Ignis-7B-DPO was trained on the Neural-DPO dataset using an A100 80GB GPU.
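Below is a minimal inference sketch using the Hugging Face Transformers library. The repository id `NeuralNovel/Ignis-7B-DPO`, the prompt, and the generation settings are assumptions for illustration, not confirmed details of this release.

```python
# Minimal inference sketch with Transformers.
# The repo id below is an assumption; replace it with the actual model path.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "NeuralNovel/Ignis-7B-DPO"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision so a 7B model fits on a single GPU
    device_map="auto",
)

prompt = "Explain Direct Preference Optimization in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```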
More Details:
Coming Soon