---
title: Oxford Pet Classifier
emoji: 💻
colorFrom: yellow
colorTo: pink
sdk: gradio
sdk_version: 5.25.2
app_file: app.py
pinned: false
---

### 📊 Zero-Shot Classification Results (CLIP)

Evaluated using `openai/clip-vit-base-patch32` on 100 test samples from the Oxford-IIIT Pet dataset.

- **Accuracy**: 76.00%
- **Precision (macro)**: 81.56%
- **Recall (macro)**: 76.16%

The CLIP model was applied without any fine-tuning, using label prompts of the form `"a photo of a [breed]"`.

Check out the configuration reference at https://huggingface.co/docs/hub/spaces-config-reference
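Zero-shot classification with this checkpoint can be sketched as below. This is a minimal illustration using the `transformers` CLIP API, not the Space's actual `app.py`; the breed subset and the placeholder image are assumptions for demonstration.

```python
import torch
from PIL import Image
from transformers import CLIPModel, CLIPProcessor

model_id = "openai/clip-vit-base-patch32"
model = CLIPModel.from_pretrained(model_id)
processor = CLIPProcessor.from_pretrained(model_id)

# Illustrative subset of the 37 Oxford-IIIT Pet breeds.
breeds = ["Abyssinian", "Bengal", "beagle", "pug"]
prompts = [f"a photo of a {breed}" for breed in breeds]

# Placeholder image; replace with a real pet photo for meaningful output.
image = Image.new("RGB", (224, 224))

inputs = processor(text=prompts, images=image, return_tensors="pt", padding=True)
with torch.no_grad():
    outputs = model(**inputs)

# Image-text similarity scores, softmaxed over the candidate prompts.
probs = outputs.logits_per_image.softmax(dim=-1)
predicted = breeds[probs.argmax().item()]
print(predicted)
```

Accuracy is sensitive to the prompt template; CLIP's authors found templates like `"a photo of a {label}, a type of pet"` can add a few points over a bare label.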