The official base model weights (`YiDuo1999/Llama-3-Physician-8B-Base`) for the paper "Efficient Continual Pre-training by Mitigating the Stability Gap".

The model has been continually pre-trained on a high-quality medical sub-corpus drawn from the RefinedWeb dataset.
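
As a minimal sketch of how these weights can be loaded, assuming the standard Hugging Face `transformers` API (the dtype, device placement, prompt, and generation settings below are illustrative choices, not prescribed by the paper):

```python
# Minimal loading sketch for the continually pre-trained base model.
# Repo id is taken from this model card; other settings are assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "YiDuo1999/Llama-3-Physician-8B-Base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumption: bf16 for the 8B weights
    device_map="auto",
)

# This is a base (non-instruct) causal LM, so prompt it as plain text
# completion rather than with a chat template.
prompt = "Hypertension is most commonly treated with"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```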
