Using Autodistill, you can compare CLIP and MetaCLIP on your own images in a few lines of code.
First, install the required dependencies:
pip install autodistill autodistill-clip autodistill-metaclip
Then, create a new Python file and add the following code:
from autodistill_metaclip import MetaCLIP
from autodistill_clip import CLIP
from autodistill.detection import CaptionOntology

classes = ["solar panel", "building"]

# Map prompts (keys) to the class labels (values) each model should return
ontology = CaptionOntology(
    {
        "solar panel": "solar panel",
        "building": "building"
    }
)

metaclip_model = MetaCLIP(ontology=ontology)
clip_model = CLIP(ontology=ontology)

# confidence=0 keeps every prediction, so nothing is filtered out
metaclip_results = metaclip_model.predict("./image.png", confidence=0)
clip_results = clip_model.predict("./image.png", confidence=0)

metaclip_classes = [classes[i] for i in metaclip_results.class_id.tolist()]
metaclip_scores = metaclip_results.confidence.tolist()

print("MetaCLIP Classes:", metaclip_classes)
print("MetaCLIP Scores:", metaclip_scores)

clip_classes = [classes[i] for i in clip_results.class_id.tolist()]
clip_scores = clip_results.confidence.tolist()

print("CLIP Classes:", clip_classes)
print("CLIP Scores:", clip_scores)
When you run the script, you will see output comparing the predictions from the two models:
MetaCLIP Classes: ['solar panel', 'building']
MetaCLIP Scores: [0.6350001096725464, 0.3649998903274536]
CLIP Classes: ['solar panel']
CLIP Scores: [0.6350001096725464]
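To reduce each model's raw scores to a single answer, you can take the class with the highest confidence. The helper below is a minimal sketch that operates on plain lists like those printed above; `top_prediction` is a hypothetical name, not part of Autodistill.

```python
def top_prediction(class_names, scores):
    """Return the (label, score) pair with the highest confidence."""
    if not class_names:
        return None
    return max(zip(class_names, scores), key=lambda pair: pair[1])

# Using the example MetaCLIP results printed above
metaclip_top = top_prediction(
    ["solar panel", "building"],
    [0.6350001096725464, 0.3649998903274536],
)
print("MetaCLIP top prediction:", metaclip_top)
```

The same helper works on the CLIP lists, which makes it easy to compare the two models' top labels side by side.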