My DialoGPT Model

This model is a fine-tuned version of microsoft/DialoGPT-small, trained on a custom dataset about Dominica.

Model Details

  • Base Model: microsoft/DialoGPT-small
  • Training Data: Custom dataset about Dominica
  • Evaluation: eval_loss of 12.85 (see the perplexity sketch after this list)
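
For a causal language model, the reported eval_loss is the average cross-entropy per token (in nats), so its exponential gives the perplexity. A minimal sketch of that conversion, using the value reported above:

import math

# Reported evaluation loss: average cross-entropy per token, in nats
eval_loss = 12.85

# Perplexity is the exponential of the cross-entropy loss
perplexity = math.exp(eval_loss)
print(f"Perplexity: {perplexity:,.0f}")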

Usage

To use this model, you can load it as follows:

from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the fine-tuned model and tokenizer from the Hub
model_name = "unknownCode/IslandBoyRepo"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

# Encode the prompt; DialoGPT expects each turn to end with the eos_token
input_text = "What is the capital of Dominica?" + tokenizer.eos_token
inputs = tokenizer(input_text, return_tensors="pt")

# Generate a response (GPT-2-based tokenizers have no pad token, so reuse eos)
outputs = model.generate(
    **inputs,
    max_new_tokens=50,
    pad_token_id=tokenizer.eos_token_id,
)

# generate() returns the prompt followed by the continuation; decode only the new tokens
response = tokenizer.decode(
    outputs[0, inputs["input_ids"].shape[-1]:], skip_special_tokens=True
)
print(response)
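
For multi-turn chat, DialoGPT-style models are normally driven by concatenating every turn, each terminated with the tokenizer's eos_token, and feeding the running history back into generate. A minimal sketch of that loop, assuming the same repository name as above (the three-turn limit and the chat_history variable are illustrative):

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "unknownCode/IslandBoyRepo"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name)

chat_history = None  # token ids of the whole conversation so far
for _ in range(3):  # e.g. three user turns
    user_input = input("You: ")
    # End each user turn with eos_token so the model knows where the turn stops
    new_ids = tokenizer(user_input + tokenizer.eos_token, return_tensors="pt").input_ids
    chat_history = new_ids if chat_history is None else torch.cat([chat_history, new_ids], dim=-1)

    # Generate a reply conditioned on the full history
    output_ids = model.generate(
        chat_history,
        max_new_tokens=50,
        pad_token_id=tokenizer.eos_token_id,
    )
    reply = tokenizer.decode(output_ids[0, chat_history.shape[-1]:], skip_special_tokens=True)
    print("Bot:", reply)

    # The model's reply (including its eos token) becomes part of the history
    chat_history = output_ids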

Model Size

  • Parameters: 124M
  • Tensor type: F32
  • Format: Safetensors