spore-sight-fungal-classifier

Fine-tuned Nucleotide Transformer for fungal spore classification.

Developed by: Angad28
Project: Smart India Hackathon 2025 - Spore Sight
Base Model: InstaDeepAI/nucleotide-transformer-v2-100m-multi-species

Quick Start

from transformers import AutoTokenizer, AutoModelForSequenceClassification
import torch

# Load model
tokenizer = AutoTokenizer.from_pretrained("Angad28/spore-sight-fungal-classifier")
model = AutoModelForSequenceClassification.from_pretrained("Angad28/spore-sight-fungal-classifier")

# Classify DNA sequence
sequence = "ATGCGTACGTACGTACGTACGTACGTACGTAC"
inputs = tokenizer(sequence, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    outputs = model(**inputs)
    predictions = torch.nn.functional.softmax(outputs.logits, dim=-1)
    predicted_class = torch.argmax(predictions, dim=-1)
    confidence = predictions.max()

print(f"Predicted class: {predicted_class.item()}")
print(f"Confidence: {confidence.item():.3f}")

GPU Acceleration (RTX 4060 Compatible)

# GPU setup
device = torch.device("cuda" if torch.cuda.is_available() else "cpu")
model = model.to(device)

# Batch processing
sequences = ["ATGCGTAC...", "CGATCGAT...", "TACGATCG..."]
inputs = tokenizer(sequences, return_tensors="pt", truncation=True, 
                  max_length=512, padding=True).to(device)

with torch.no_grad():
    outputs = model(**inputs)
    predictions = torch.nn.functional.softmax(outputs.logits, dim=-1)
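
Per-sequence labels and confidence scores can then be read off the batched predictions tensor with standard tensor operations; this short follow-up assumes the sequences, model, and inputs from the snippet above:

# Per-sequence predicted class and confidence
predicted_classes = predictions.argmax(dim=-1)   # shape: (batch,)
confidences = predictions.max(dim=-1).values     # shape: (batch,)

for seq, cls, conf in zip(sequences, predicted_classes, confidences):
    print(f"{seq[:10]}...: class {cls.item()} (confidence {conf.item():.3f})")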

Classes

{ "2": "Alternaria", "15": "Candida", "92": "Mucor", "175": "Other" }

Node.js Backend Integration

This model works with the Spore Sight Node.js backend:

# Set environment variables in your .env file
HF_MODEL_ID='Angad28/spore-sight-fungal-classifier'
HF_TOKEN='your_hugging_face_token_here'

# Start backend
npm start

Performance

  • Hardware: optimized for NVIDIA RTX 4060-class GPUs
  • Batch Size: 8 sequences (recommended; see the batching sketch below)
  • Max Length: 512 nucleotides
  • Inference Speed: ~100 sequences/second on an RTX 4060
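
A minimal batching sketch for larger inputs, assuming the model, tokenizer, and device from the GPU section are already set up; batch size and max length follow the recommendations above:

BATCH_SIZE = 8

def classify_batched(seqs, batch_size=BATCH_SIZE):
    """Run inference over seqs in fixed-size batches and return class indices."""
    all_preds = []
    for i in range(0, len(seqs), batch_size):
        batch = seqs[i:i + batch_size]
        inputs = tokenizer(batch, return_tensors="pt", truncation=True,
                           max_length=512, padding=True).to(device)
        with torch.no_grad():
            logits = model(**inputs).logits
        all_preds.extend(logits.argmax(dim=-1).cpu().tolist())
    return all_preds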

Model Architecture

  • Base: Nucleotide Transformer v2 100M Multi-Species
  • Fine-tuning: LoRA (Low-Rank Adaptation); see the illustrative sketch after this list
  • Task: Multi-class fungal classification
  • Classes: 4 (Alternaria, Candida, Mucor, Other)
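
For reference, a LoRA fine-tune of the base model can be sketched with the peft library. This is an illustrative reconstruction, not the exact training recipe: the rank, alpha, dropout, and target module names below are assumptions (the targets follow ESM-style attention projections), and trust_remote_code=True is assumed because the v2 base model ships custom modeling code.

from transformers import AutoModelForSequenceClassification, AutoTokenizer
from peft import LoraConfig, TaskType, get_peft_model

base_id = "InstaDeepAI/nucleotide-transformer-v2-100m-multi-species"
tokenizer = AutoTokenizer.from_pretrained(base_id, trust_remote_code=True)
base_model = AutoModelForSequenceClassification.from_pretrained(
    base_id, num_labels=4, trust_remote_code=True)

# Illustrative LoRA configuration (hyperparameters are assumptions)
lora_config = LoraConfig(
    task_type=TaskType.SEQ_CLS,
    r=16,
    lora_alpha=32,
    lora_dropout=0.1,
    target_modules=["query", "value"],  # ESM-style attention projection names
)
model = get_peft_model(base_model, lora_config)
model.print_trainable_parameters()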

Installation & Usage

pip install transformers torch

# For GPU support
pip install torch torchvision torchaudio --index-url https://download.pytorch.org/whl/cu118
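
After installing the CUDA build, a quick sanity check that PyTorch can see the GPU:

import torch

print(torch.__version__)
print(torch.cuda.is_available())          # should print True on a working RTX 4060 setup
if torch.cuda.is_available():
    print(torch.cuda.get_device_name(0))  # e.g. an RTX 4060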

Citation

@misc{spore-sight-fungal-classifier,
  title={Spore Sight: Fine-tuned Nucleotide Transformer for Fungal Classification},
  author={Angad28},
  year={2025},
  publisher={Hugging Face},
  url={https://huggingface.co/Angad28/spore-sight-fungal-classifier}
}

License

MIT License


Ready for production with RTX 4060 GPU acceleration! 🚀
