# 🌱 mT5 Crop Recommendation (LoRA Fine-tuned)

This is a fine-tuned mT5 model using LoRA adapters for crop recommendation tasks. It takes weather and environmental inputs and suggests the most suitable crop(s), along with profitability insights.
## 🧑‍🏫 Model Details

- Base Model: google/mt5-base
- Fine-tuning Method: LoRA (Low-Rank Adaptation)
- Framework: 🤗 Transformers
- Dataset: Custom crop recommendation dataset (weather, soil, and profitability annotations)
- Languages: English
## 📊 Training

- Epochs: ~0.1–1 (early stopping around loss 0.1)
- Trainable Parameters: ~344K (LoRA adapters only)
- Total Parameters: ~300M
- Hardware: A100 GPU (Colab)
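The small trainable-parameter count is a direct consequence of how LoRA works: each adapted weight matrix of shape `d_out × d_in` gains only `r · (d_in + d_out)` extra parameters for rank `r`. A minimal pure-Python sanity check (the rank and the set of adapted attention projections below are illustrative assumptions, not values stated in this card):

```python
def lora_params(d_in: int, d_out: int, r: int) -> int:
    """Extra parameters LoRA adds to one d_out x d_in weight matrix:
    a rank-r down-projection A (r x d_in) plus an up-projection B (d_out x r)."""
    return r * (d_in + d_out)

# Hypothetical configuration: rank-2 adapters on the query and value
# projections (768 x 768 in mt5-base) of every attention block.
d_model = 768
rank = 2
# Assumption: q + v in 12 encoder self-attn, 12 decoder self-attn,
# and 12 decoder cross-attn blocks -> 36 blocks x 2 matrices.
n_adapted = 72
total = n_adapted * lora_params(d_model, d_model, rank)
print(total)  # 221184 -- same order of magnitude as the ~344K reported
```

The exact total depends on the chosen rank and target modules, but any reasonable configuration lands in the hundreds of thousands of parameters, versus ~300M for the full model.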
## 🚀 Example Usage

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

model_name = "your-username/mt5-crop-lora"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

query = "Suggest best crop given: rainfall=200mm, temperature=25C, soil=loamy"
inputs = tokenizer(query, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```
Sample Output:

```text
Best crop: Gram.
Top 3: Gram, Mustard, Wheat.
Weather outlook: Cooler, dry weather with lower rainfall.
Profitability: Moderate.
```
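For batch or programmatic use, the query string can be assembled from structured inputs rather than written by hand. A small helper sketch (the prompt template is inferred from the example above; the field names are assumptions):

```python
def build_query(rainfall_mm: float, temperature_c: float, soil: str) -> str:
    """Format environmental inputs into the prompt style shown in the usage example."""
    return (
        f"Suggest best crop given: rainfall={rainfall_mm:g}mm, "
        f"temperature={temperature_c:g}C, soil={soil}"
    )

print(build_query(200, 25, "loamy"))
# Suggest best crop given: rainfall=200mm, temperature=25C, soil=loamy
```

Keeping the prompt format identical to the one used during fine-tuning generally matters for seq2seq models, so a single helper like this avoids accidental drift across callers.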
## ✅ Use Cases

- Agricultural planning
- Crop advisory chatbots
- Climate-aware farming assistance
## ⚠️ Limitations

- Trained on a limited dataset, so it may not generalize globally
- Should not replace expert agronomic advice
- Cross-check recommendations against local agronomic data
## 📜 License

Apache 2.0
## 🙏 Acknowledgements

- Google Research for mT5
- Hugging Face Transformers & PEFT