
Grounded SAM vs. SAM-CLIP

Both Grounded SAM and SAM-CLIP are commonly used in computer vision projects. Below, we compare and contrast Grounded SAM and SAM-CLIP.

                  Grounded SAM            SAM-CLIP
Date of Release
Model Type        Instance Segmentation   Instance Segmentation
Architecture
GitHub Stars

Compare Grounded SAM and SAM-CLIP with Autodistill

Using Autodistill, you can compare Grounded SAM and SAM-CLIP on your own images in a few lines of code.


To start a comparison, first install the required dependencies:


pip install autodistill autodistill-grounded-sam autodistill-sam-clip

Next, create a new Python file and add the following code:


from autodistill_grounded_sam import GroundedSAM
from autodistill_sam_clip import SAMCLIP

from autodistill.detection import CaptionOntology
from autodistill.utils import compare

# Map the prompt sent to each model (key) to the class name
# saved in your labels (value)
ontology = CaptionOntology(
    {
        "solar panel": "solar panel",
    }
)

# The models to compare, each configured with the same ontology
models = [
    GroundedSAM(ontology=ontology),
    SAMCLIP(ontology=ontology)
]

# Absolute paths to the images on which to compare the models
images = [
    "/home/user/autodistill/solarpanel1.jpg",
    "/home/user/autodistill/solarpanel2.jpg"
]

# Run each model on each image and display the results
compare(
    models=models,
    images=images
)
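
The keys in the `CaptionOntology` are the text prompts sent to each model, and the values are the class names written to your labels. To compare the models on more than one class, add more entries. For example (the extra prompt and class name below are illustrative, not part of the original script):


ontology = CaptionOntology(
    {
        "solar panel": "solar panel",
        "rooftop": "roof",
    }
)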

In the `images` list, replace the example paths with the images you want to use. The paths must be absolute.
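
If your images live in a local folder, one way to collect absolute paths is with Python's standard library (the folder name below is just an example):


import os

# Resolve the folder to an absolute path
image_dir = os.path.abspath("images")

# Build absolute paths to every .jpg file in the folder
images = [
    os.path.join(image_dir, name)
    for name in sorted(os.listdir(image_dir))
    if name.lower().endswith(".jpg")
]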

Then, run the script.

The script will display a model comparison showing each model's predictions on your images.

When you have chosen the model that works best for your use case, you can auto-label a folder of images with the following code:


# Use whichever model performed best in your comparison, for example:
base_model = GroundedSAM(ontology=ontology)

base_model.label(
  input_folder="./images",
  output_folder="./dataset",
  extension=".jpg"
)

