MobileSAM model

This repository contains the weights of MobileSAM, a lightweight variant of the Segment Anything Model (SAM).

Installation

First, clone the `add_mixin` branch of the MobileSAM package and install it:

```bash
git clone -b add_mixin https://github.com/NielsRogge/MobileSAM.git
cd MobileSAM
pip install -e .
```

Usage

The model can then be used as follows:

```python
import torch
from mobile_sam import MobileSAM, SamPredictor

# load the pretrained weights from the Hub
model = MobileSAM.from_pretrained("nielsr/mobilesam")

# move the model to GPU if available
device = "cuda" if torch.cuda.is_available() else "cpu"
model.to(device=device)

# perform inference
predictor = SamPredictor(model)
predictor.set_image(<your_image>)
masks, _, _ = predictor.predict(<input_prompts>)
```
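The `<input_prompts>` placeholder above stands for SAM-style prompts. As a hedged sketch (assuming the standard SAM predictor API, where `point_coords` is an `(N, 2)` array of pixel coordinates and `point_labels` is an `(N,)` array with `1` for foreground and `0` for background), the prompt arrays could be built like this:

```python
import numpy as np

# one foreground click at pixel (x=256, y=256); coordinates are (x, y)
point_coords = np.array([[256, 256]])
point_labels = np.array([1])  # 1 = foreground point, 0 = background point

# sanity-check the expected shapes before passing them to the predictor
assert point_coords.shape == (1, 2)
assert point_labels.shape == (1,)
```

These arrays would then be passed as `predictor.predict(point_coords=point_coords, point_labels=point_labels)`; the exact keyword names follow the upstream SAM predictor and may differ if the fork changes the API.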