🌱 mT5 Crop Recommendation (LoRA Fine-tuned)

This is an mT5 model fine-tuned with LoRA adapters for crop recommendation.
Given weather and environmental inputs, it suggests the most suitable crop(s) along with profitability insights.

πŸ§‘β€πŸ« Model Details

  • Base Model: google/mt5-base
  • Fine-tuning Method: LoRA (Low-Rank Adaptation)
  • Framework: 🤗 Transformers
  • Dataset: Custom crop recommendation dataset (weather, soil, profitability annotations)
  • Languages: English
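
LoRA freezes the pretrained weights and trains only two small matrices per adapted layer; their product forms a low-rank update that is added to the frozen weight. A minimal NumPy sketch of the idea (illustrative dimensions and rank, not the actual mT5 training code):

```python
import numpy as np

rng = np.random.default_rng(0)

d, r = 768, 8          # hidden size (mT5-base uses 768) and LoRA rank (illustrative)
alpha = 16             # LoRA scaling factor (illustrative)

W = rng.normal(size=(d, d))          # frozen pretrained weight
A = rng.normal(size=(r, d)) * 0.01   # trainable down-projection
B = np.zeros((d, r))                 # trainable up-projection, zero-initialised

x = rng.normal(size=(d,))

# Forward pass: base output plus the scaled low-rank update.
y = W @ x + (alpha / r) * (B @ (A @ x))

# With B initialised to zero, the adapter starts as a no-op.
assert np.allclose(y, W @ x)

# Only 2*d*r parameters are trained per adapted matrix, vs d*d frozen.
print(2 * d * r, "trainable vs", d * d, "frozen")  # 12288 trainable vs 589824 frozen
```

Because `B` starts at zero, training begins from the exact pretrained behaviour and only gradually learns a task-specific correction.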

📊 Training

  • Epochs: ~0.1–1 (early stopping once training loss reached ~0.1)
  • Trainable Parameters: ~344K (LoRA only)
  • Total Parameters: ~300M
  • Hardware: A100 GPU (Colab)
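
The figures above mean only a tiny fraction of the network is ever updated; a quick check using the rounded numbers from this card:

```python
# Rounded figures from the training summary above.
trainable = 344_000        # LoRA adapter parameters
total = 300_000_000        # total mT5-base parameters

fraction = trainable / total
print(f"{fraction:.4%} of parameters trained")  # 0.1147% of parameters trained
```

Training roughly 0.1% of the weights is what makes a single-GPU fine-tune of a ~300M-parameter model practical.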

🔎 Example Usage

from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Replace with the actual model repository ID.
model_name = "your-username/mt5-crop-lora"
tokenizer = AutoTokenizer.from_pretrained(model_name)
# Assumes the LoRA weights were merged into the base model before upload.
# If the repo hosts only adapter weights, load google/mt5-base first and
# attach the adapter with peft.PeftModel.from_pretrained instead.
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

query = "Suggest best crop given: rainfall=200mm, temperature=25C, soil=loamy"

inputs = tokenizer(query, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)

print(tokenizer.decode(outputs[0], skip_special_tokens=True))

Sample Output:

Best crop: Gram.  
Top 3: Gram, Mustard, Wheat.  
Weather outlook: Cooler, dry weather with lower rainfall.  
Profitability: Moderate.
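
The generated text follows a simple `key: value` line format, so downstream code (e.g. a chatbot) can turn it into a structured record. A hypothetical helper, tested against the sample output above:

```python
# Hypothetical helper: parse the model's "key: value" output lines into a dict.
def parse_recommendation(text: str) -> dict:
    result = {}
    for line in text.splitlines():
        line = line.strip()
        if ":" in line:
            key, value = line.split(":", 1)
            result[key.strip()] = value.strip().rstrip(".")
    return result

sample = """Best crop: Gram.
Top 3: Gram, Mustard, Wheat.
Weather outlook: Cooler, dry weather with lower rainfall.
Profitability: Moderate."""

parsed = parse_recommendation(sample)
print(parsed["Best crop"])            # Gram
print(parsed["Top 3"].split(", "))    # ['Gram', 'Mustard', 'Wheat']
```

Note that the model is a free-form text generator, so a real application should handle outputs that deviate from this format.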

✅ Use Cases

  • Agricultural planning
  • Crop advisory chatbots
  • Climate-aware farming assistance

⚠️ Limitations

  • Limited dataset → may not generalize globally
  • Should not replace expert advice
  • Cross-check with local agronomic data

📜 License

Apache 2.0

🙌 Acknowledgements
