MaxMini-Instruct-248M

Overview

MaxMini-Instruct-248M is a T5 (Text-To-Text Transfer Transformer) model instruction fine-tuned on a variety of tasks. It is designed to follow a broad range of natural-language instructions.

Model Details

  • Model Name: MaxMini-Instruct-248M
  • Model Type: T5 (Text-To-Text Transfer Transformer)
  • Model Size: 248M parameters
  • Fine-Tuning: Instruction tuning

Usage

Installation

You can install the required dependencies via pip:

pip install transformers
pip install torch

Inference

# Load the tokenizer and model directly from the Hugging Face Hub
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

tokenizer = AutoTokenizer.from_pretrained("suriya7/MaxMini-Instruct-248M")
model = AutoModelForSeq2SeqLM.from_pretrained("suriya7/MaxMini-Instruct-248M")

my_question = "what is depression?"
prompt = "Please answer to this question: " + my_question

# Tokenize the prompt and generate a response
inputs = tokenizer(prompt, return_tensors="pt")
generated_ids = model.generate(**inputs, max_new_tokens=250, do_sample=True)
decoded = tokenizer.decode(generated_ids[0], skip_special_tokens=True)
print(f"Generated Output: {decoded}")
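
As an alternative to calling generate() directly, the same checkpoint can be used through the transformers pipeline API with the "text2text-generation" task, which is the standard task for T5-style seq2seq models. This is a minimal sketch: the build_prompt helper is an illustrative wrapper around the instruction prefix shown above, not part of the model's API, and the generation settings are not tuned values.

```python
from transformers import pipeline

def build_prompt(question: str) -> str:
    # Same instruction prefix used in the example above
    return "Please answer to this question: " + question

# text2text-generation is the pipeline task for seq2seq (encoder-decoder) models
generator = pipeline("text2text-generation", model="suriya7/MaxMini-Instruct-248M")

result = generator(build_prompt("what is depression?"),
                   max_new_tokens=250, do_sample=True)
print(result[0]["generated_text"])
```

Because do_sample=True draws tokens stochastically, outputs will differ between runs; set do_sample=False for deterministic greedy decoding.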