Model Card for PathOrchestra_V1.0.0

Requesting Access

You need to request access using the official (institutional) email address registered to your Hugging Face account.

Model Description

PathOrchestra is a versatile pathology foundation model trained via self-supervised learning on a dataset of 300,000 pathology slides (262.5 TB) spanning 20 tissue and organ types from multiple centers. The model was rigorously evaluated on 112 clinical tasks using a combination of 61 private and 51 public datasets.

How to Use as a Vision Encoder

import timm
from timm.data import resolve_data_config
from timm.data.transforms_factory import create_transform
from huggingface_hub import login

login()  # login with your User Access Token, found at https://huggingface.co/settings/tokens

# pretrained=True is needed to load the PathOrchestra_V1.0.0 weights
model = timm.create_model("hf-hub:yf-research/PathOrchestra_V1.0.0", pretrained=True, init_values=1e-5, dynamic_img_size=True)
transform = create_transform(**resolve_data_config(model.pretrained_cfg, model=model))
model.eval()
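
As a quick sanity check, you can run a dummy input through the encoder and inspect the output shape. This is a minimal sketch, assuming the hub configuration exposes the model as a feature extractor whose forward pass returns a pooled embedding:

import torch

# Dummy 224x224 RGB input; the output shape reveals the embedding dimension
dummy = torch.zeros(1, 3, 224, 224)
with torch.inference_mode():
    emb = model(dummy)
print(emb.shape)  # e.g. [1, embedding_dim]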

You can use the pretrained encoder to extract features from pathology patches, as follows:

import torch
from PIL import Image
from torchvision import transforms

# Standard ImageNet-style preprocessing: resize to 224 and normalize with ImageNet statistics
transform = transforms.Compose(
    [
        transforms.Resize(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=(0.485, 0.456, 0.406), std=(0.229, 0.224, 0.225)),
    ]
)

image = Image.open("example.png").convert("RGB")
image = transform(image).unsqueeze(dim=0)  # add a batch dimension

with torch.inference_mode():
    feature_emb = model(image)  # patch-level feature embedding
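
To embed many patches at once, you can wrap the tiles in a small Dataset and batch them through a DataLoader. The sketch below is an assumption-laden example, not part of the official pipeline: it assumes a folder of square RGB patch tiles at tiles/*.png and reuses the transform defined above.

import glob
import torch
from PIL import Image
from torch.utils.data import DataLoader, Dataset

class PatchDataset(Dataset):
    """Loads patch images from disk and applies the preprocessing transform."""

    def __init__(self, paths, transform):
        self.paths = paths
        self.transform = transform

    def __len__(self):
        return len(self.paths)

    def __getitem__(self, idx):
        img = Image.open(self.paths[idx]).convert("RGB")
        return self.transform(img)

# "tiles/*.png" is an assumed layout for pre-extracted patches
paths = sorted(glob.glob("tiles/*.png"))
loader = DataLoader(PatchDataset(paths, transform), batch_size=32, num_workers=4)

embeddings = []
with torch.inference_mode():
    for batch in loader:
        embeddings.append(model(batch))
embeddings = torch.cat(embeddings)  # [num_patches, embedding_dim]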

Contact

For any additional questions or comments, contact the corresponding authors.

Acknowledgements

Thanks to the authors of DINOv2 and UNI.
