Med-Llama-3.1-8B-DeepSeek-Distilled

Model Overview

This model was fine-tuned from enesarda22/Llama-3.1-8B-DeepSeek67B-Distilled on a medical corpus. It has 8.03B parameters and is distributed in Safetensors format with BF16 tensors.

Evaluation Scores

Usage

Load the model and tokenizer from the Hugging Face Hub:

from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("enesarda22/Med-Llama-3.1-8B-DeepSeek67B-Distilled")
tokenizer = AutoTokenizer.from_pretrained("enesarda22/Med-Llama-3.1-8B-DeepSeek67B-Distilled")
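
Once loaded, the model can be prompted like any other causal language model in transformers. The sketch below is illustrative only: the example medical prompt, the max_new_tokens value, and the use of CPU inference with torch.no_grad are assumptions, not settings specified by this model card.

import torch

# Hypothetical example prompt; replace with your own medical question.
prompt = "What are the common symptoms of iron-deficiency anemia?"
inputs = tokenizer(prompt, return_tensors="pt")

# Generate a continuation. The generation settings here are illustrative
# defaults, not values recommended by the model authors.
with torch.no_grad():
    output_ids = model.generate(**inputs, max_new_tokens=256)

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))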