This is an unsloth/Phi-3-mini-4k-instruct model, fine-tuned on the b-mc2/sql-create-context, Clinton/Text-to-sql-v1, and knowrohit07/know_sql datasets.

Model Usage

Use the unsloth library to load and run the model.

Install unsloth and other dependencies.

# Installs Unsloth, Xformers (Flash Attention) and all other packages!
!pip install "unsloth[colab-new] @ git+https://github.com/unslothai/unsloth.git"
!pip install --no-deps xformers "trl<0.9.0" peft accelerate bitsandbytes torch

Use FastLanguageModel to download and load the model from the Hugging Face Hub.

from unsloth import FastLanguageModel
model, tokenizer = FastLanguageModel.from_pretrained(
    model_name = "dmedhi/Phi-3-mini-4k-instruct-text2SQL",
    max_seq_length = 2048,
    dtype = None,        # auto-detect: float16 on older GPUs, bfloat16 on Ampere+
    load_in_4bit = True, # load 4-bit quantized weights to reduce memory usage
)
FastLanguageModel.for_inference(model)  # enable Unsloth's faster inference mode

prompt = """Below is a question that describes a SQL function, paired with a table Context that provides SQL table context. Write an answer that fullfils the user query.

### Question:
{}

### Context:
{}

### Answer:
{}"""

inputs = tokenizer(
[
    prompt.format(
        "What is the latest year that has ferrari 166 fl as the winning constructor?",
        """CREATE TABLE table_name_7 (
            year INTEGER,
            winning_constructor VARCHAR
        )""",
        ""
    )
], return_tensors = "pt").to("cuda")

outputs = model.generate(**inputs, max_new_tokens = 64, use_cache = True)
tokenizer.batch_decode(outputs)
# ["<s> Below is a question that describes a SQL function, paired with a table Context that provides SQL table context. Write an answer that fullfils the user query.\n\n### Question:\nWhat is the latest year that has ferrari 166 fl as the winning constructor?\n\n### Context:\nCREATE TABLE table_name_7 (\n    year INTEGER,\n    winning_constructor VARCHAR\n)\n\n### Answer:\nTo find the latest year that Ferrari 166 FL was the winning constructor, you can use the following SQL query:\n\n```sql\nSELECT MAX(year)\nFROM table_name_7\nWHERE winning_constructor = 'Ferrari 166 FL';\n```\n"]