πŸ” Model Card for mini-recurrence-converter-dsl-adapter

🧠 Model Details

This model is a parameter-efficient fine-tuning (PEFT) LoRA adapter, designed as a demonstration and testbed for the Mini Recurrence Converter DSL module.

It is provided for demonstration and experimentation purposes only.

It translates English recurrence expressions, such as "every Monday at 10am" or "on the last Friday of each month", into structured function calls using that custom DSL.

  • Model type: LoRA adapter (PEFT)
  • Language(s): English
  • License: MIT
  • Finetuned from model: microsoft/Phi-4-mini-instruct

🚀 Uses

This adapter specializes Phi-4-mini-instruct for parsing natural-language recurrence expressions into the DSL format. It supports the DSL functions defined in the Mini Recurrence Converter DSL module.

Example prompt:

$ You are a precise parser of recurring schedule expressions. Your only job is to translate natural language recurrence expressions into structured DSL function calls such as WEEKLY(...) or MONTHLY_BY_WEEKDAY(...). Do not explain or elaborate. Only return the code.
> every second Tuesday of the month at 1pm  
< MONTHLY_BY_WEEKDAY(1, TU, 2, TIME(13, 0))

⚠️ Out-of-Scope Use

This adapter has been fine-tuned specifically as a demonstration and testbed for the Mini Recurrence Converter DSL module. It is not intended for general-purpose dialogue or unrelated tasks.

πŸ—οΈ Training Details

Trained on a6188466/mini-recurrence-converter-dsl-dataset using the dsl adapter from fifo-tool-datasets and fine_tune.py.

  • Dataset: 279 examples mapping English recurrence expressions to DSL commands, including hand-curated and synthetic samples
  • Epochs: 15
  • Batch size: 1
  • Precision: bf16
  • Framework: transformers, peft, trl (SFTTrainer)

βš™οΈ Training Hyperparameters

{
  "num_train_epochs": 15,
  "train_batch_size": 1,
  "learning_rate": 5e-06,
  "lr_scheduler_type": "cosine",
  "warmup_ratio": 0.2,
  "bf16": true,
  "seed": 0
}
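
For reference, these logged hyperparameters translate roughly into trl's SFTConfig as sketched below. This is an assumed mapping, not the actual contents of fine_tune.py; output_dir is a placeholder, and the real script may set additional arguments.

```python
# Sketch: the logged hyperparameters expressed as a trl SFTConfig.
# output_dir is a placeholder; fine_tune.py may configure more options.
from trl import SFTConfig

training_args = SFTConfig(
    output_dir="out",                # placeholder
    num_train_epochs=15,
    per_device_train_batch_size=1,
    learning_rate=5e-6,
    lr_scheduler_type="cosine",
    warmup_ratio=0.2,
    bf16=True,
    seed=0,
)
```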

📈 Training Results

{
  "mean_token_accuracy": 0.9538259625434875,
  "total_flos": 6218234638387200.0,
  "train_loss": 0.4653199369477257,
  "train_runtime": 932.994,
  "train_samples_per_second": 4.486,
  "train_steps_per_second": 4.486,
  "final_learning_rate": 2.751554198876516e-11
}

✅ Evaluation

  • Eval set: Natural language queries similar in structure and intent to the training examples
  • Metric: Functional equivalence β€” two DSL expressions are considered correct if they evaluate to the same result
  • Results:
    • 97.79% on held-out test set (221/226 passed), containing queries similar in structure and intent to the training examples, using the functional equivalence metric defined above. The test set includes 26 hand-curated and 200 synthetic examples.
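
The functional equivalence check can be sketched as follows. The DSL semantics here are stand-in assumptions for illustration (each DSL function is modeled as a plain Python callable returning a tuple); the actual semantics live in the Mini Recurrence Converter DSL module and its evaluation script.

```python
# Sketch of the functional-equivalence metric: two DSL expressions are
# equivalent iff they evaluate to the same value. The function signatures
# below are illustrative assumptions, not the module's real implementation.

def TIME(hour, minute):
    return ("TIME", hour, minute)

def WEEKLY(interval, weekday, time):
    return ("WEEKLY", interval, weekday, time)

def MONTHLY_BY_WEEKDAY(interval, weekday, ordinal, time):
    return ("MONTHLY_BY_WEEKDAY", interval, weekday, ordinal, time)

# Weekday tokens used by the DSL.
MO, TU, WE, TH, FR, SA, SU = "MO", "TU", "WE", "TH", "FR", "SA", "SU"

# Expose only the DSL names to the evaluator.
_NAMESPACE = {name: obj for name, obj in globals().items()
              if not name.startswith("_")}

def functionally_equivalent(expr_a: str, expr_b: str) -> bool:
    """Evaluate two DSL expressions and compare their results."""
    value_a = eval(expr_a, {"__builtins__": {}}, _NAMESPACE)
    value_b = eval(expr_b, {"__builtins__": {}}, _NAMESPACE)
    return value_a == value_b

# Equivalent modulo whitespace:
print(functionally_equivalent(
    "MONTHLY_BY_WEEKDAY(1, TU, 2, TIME(13, 0))",
    "MONTHLY_BY_WEEKDAY(1,TU,2,TIME(13,0))",
))  # True
```

Comparing evaluated results rather than raw strings means the metric tolerates formatting differences while still rejecting semantically different schedules.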

These results demonstrate acceptable performance on the evaluated queries. However, as this is a demonstration adapter trained on a narrow dataset, performance should be evaluated carefully and independently for each use case.

Evaluation script: evaluate_mini_recurrence_converter_dsl_model.py

⚠️ Disclaimer & Limitations

This adapter is intended solely as a demonstration and testbed for the Mini Recurrence Converter DSL module. It should not be used beyond that scope.

It does not cover all phrasings or edge cases of English recurrence expressions. Instead, it focuses on illustrative examples that are supported by the module's DSL functions.

This adapter is provided as is, without warranties or guarantees of any kind. It is intended for demonstration and experimentation only.

🪪 License

MIT License. See LICENSE for details.

📬 Contact

For questions, feedback, or bug reports, please open an issue on GitHub or start a discussion on the Hugging Face Hub.
