OPT-1.3B Fine-tuned with LLM Foundry FSDP

This model is a fine-tuned version of facebook/opt-1.3b, trained with MosaicML's LLM Foundry framework and PyTorch FSDP for config-driven, production-ready distributed training.
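LLM Foundry drives training from a YAML file passed to its `train.py` entry point. The fragment below is an illustrative sketch only, not the actual configuration used for this model (see the project repository for that); the key names follow LLM Foundry's documented schema, but values such as batch size and learning rate are placeholders.

```yaml
# Illustrative LLM Foundry config sketch (values are assumptions, not the
# project's actual settings). Launched with something like:
#   composer scripts/train/train.py this_config.yaml
max_seq_len: 2048

model:
  name: hf_causal_lm
  pretrained_model_name_or_path: facebook/opt-1.3b
  pretrained: true

# FSDP shards parameters, gradients, and optimizer state across the GPUs.
fsdp_config:
  sharding_strategy: FULL_SHARD
  mixed_precision: PURE
  activation_checkpointing: true

optimizer:
  name: decoupled_adamw
  lr: 2.0e-5   # placeholder value

global_train_batch_size: 16   # placeholder value
max_duration: 1ep
```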

This model was fine-tuned using the Dolly-15K instruction dataset on 2 × T4 16GB GPUs with built-in evaluation and experiment tracking.
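Each databricks-dolly-15k record carries `instruction`, `context`, and `response` fields that must be flattened into a single text sequence before tokenization. The helper below is a hypothetical sketch using a common Alpaca-style template; the exact template used for this model is not stated here, so treat it as an assumption.

```python
# Hypothetical prompt formatter (an assumption, not the project's actual
# preprocessing code) for databricks-dolly-15k records.

def format_dolly(example: dict) -> str:
    """Render one Dolly-15K record as an Alpaca-style prompt string.

    Records with a non-empty `context` field get an extra Input section;
    records without one use the shorter template.
    """
    if example.get("context"):
        return (
            "Below is an instruction that describes a task, paired with an "
            "input that provides further context. Write a response that "
            "appropriately completes the request.\n\n"
            f"### Instruction:\n{example['instruction']}\n\n"
            f"### Input:\n{example['context']}\n\n"
            f"### Response:\n{example['response']}"
        )
    return (
        "Below is an instruction that describes a task. Write a response "
        "that appropriately completes the request.\n\n"
        f"### Instruction:\n{example['instruction']}\n\n"
        f"### Response:\n{example['response']}"
    )


record = {
    "instruction": "Name three primary colors.",
    "context": "",
    "response": "Red, yellow, and blue.",
}
print(format_dolly(record))
```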

For detailed implementation, configuration files, and training procedures, please check out the project repository.
