---
title: Bioclip Demo
emoji: π
colorFrom: indigo
colorTo: purple
sdk: gradio
sdk_version: 5.33.0
app_file: app.py
pinned: false
license: mit
description: >-
  This Space provides an interactive demo for running BioCLIP inference.
  Additionally, one can run inference on multiple images using the pybioclip
  package.
tags:
  - visualization
  - data
  - samples
  - data-visualization
  - exploration
  - biology
  - vision
  - zero-shot-image-classification
  - clip
  - CV
  - images
  - animals
  - species
  - taxonomy
  - rare species
  - endangered species
  - evolutionary biology
  - multimodal
  - knowledge-guided
---
# BioCLIP Demo

A demo implementation of the [BioCLIP model](https://huggingface.co/imageomics/bioclip).
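The Space description also mentions running inference on multiple images with the pybioclip package. The snippet below is a minimal sketch of that workflow, assuming pybioclip's `TreeOfLifeClassifier` and `Rank` interface; the image file names are placeholders, not files shipped with this Space.

```python
# Minimal sketch of multi-image BioCLIP inference with pybioclip.
# Assumes the TreeOfLifeClassifier / Rank API; image paths are hypothetical.
from bioclip import TreeOfLifeClassifier, Rank

classifier = TreeOfLifeClassifier()

# Placeholder local images to classify.
images = ["Ursus-arctos.jpeg", "Felis-catus.jpeg"]

for image_path in images:
    # Predict taxa for one image at the species rank.
    predictions = classifier.predict(image_path, Rank.SPECIES)
    for prediction in predictions:
        print(image_path, prediction["species"], prediction["score"])
```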