Use the widget below to experiment with MetaCLIP. You can detect COCO classes such as people, vehicles, animals, and household items.
MetaCLIP is a zero-shot classification and embedding model developed by Meta AI.
MetaCLIP is licensed under a CC BY-NC 4.0 license.
You can use Roboflow Inference to deploy a MetaCLIP API on your hardware. You can deploy the model on CPU devices (e.g., Raspberry Pi, AI PCs) and GPU devices (e.g., NVIDIA Jetson, NVIDIA T4).
Below are instructions on how to deploy your own model API.
First, install Autodistill and Autodistill MetaCLIP:
pip install autodistill autodistill-metaclip
Then, run:
from autodistill_metaclip import MetaCLIP
from autodistill.detection import CaptionOntology

# define an ontology to map class names to our MetaCLIP prompt
# the ontology dictionary has the format {caption: class}
# where caption is the prompt sent to the base model, and class is the label that will
# be saved for that caption in the generated annotations
# then, load the model
base_model = MetaCLIP(
    ontology=CaptionOntology(
        {
            "person": "person",
            "a forklift": "forklift"
        }
    )
)
results = base_model.predict("./image.png")
print(results)
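Under the hood, CLIP-style zero-shot classification embeds the image and each caption prompt in a shared vector space, then ranks captions by cosine similarity (often passed through a softmax to produce probabilities). The snippet below is a minimal sketch of that scoring step using toy Python lists as stand-in embeddings; the vectors and helper names are illustrative, not part of the MetaCLIP or Autodistill API.

```python
import math

def cosine_similarity(a, b):
    # dot(a, b) / (|a| * |b|)
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def rank_captions(image_embedding, caption_embeddings):
    # score each caption against the image, then softmax the scores
    sims = {c: cosine_similarity(image_embedding, e)
            for c, e in caption_embeddings.items()}
    exp = {c: math.exp(s) for c, s in sims.items()}
    total = sum(exp.values())
    return {c: v / total for c, v in exp.items()}

# toy embeddings (illustrative only; real CLIP embeddings have hundreds of dims)
image_vec = [0.9, 0.1, 0.0]
captions = {
    "person": [0.8, 0.2, 0.1],
    "a forklift": [0.1, 0.9, 0.3],
}
probs = rank_captions(image_vec, captions)
best = max(probs, key=probs.get)  # caption most similar to the image
```

In this sketch the image vector points in roughly the same direction as the "person" embedding, so "person" receives the highest probability, mirroring how the ontology's captions compete for each image at prediction time.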