How to use arampacha/clip-test with Transformers:
```python
# Use a pipeline as a high-level helper
from transformers import pipeline

pipe = pipeline("zero-shot-image-classification", model="arampacha/clip-test")
pipe(
    "https://huggingface.co/datasets/huggingface/documentation-images/resolve/main/hub/parrots.png",
    candidate_labels=["animals", "humans", "landscape"],
)
```

```python
# Load model directly
from transformers import AutoProcessor, AutoModelForZeroShotImageClassification

processor = AutoProcessor.from_pretrained("arampacha/clip-test")
model = AutoModelForZeroShotImageClassification.from_pretrained("arampacha/clip-test")
```

This model is a fine-tuned version of openai/clip-vit-base-patch32 on the arampacha/rsicd dataset. It achieves the following results on the evaluation set:
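When loading the model directly rather than through the pipeline, you post-process the raw `logits_per_image` yourself. A minimal sketch of that step, using a dummy logits tensor in place of a real `model(**inputs)` forward pass (which would require downloading the checkpoint):

```python
import torch

# Dummy logits standing in for model(**inputs).logits_per_image
# from a real forward pass over one image and three text labels.
logits_per_image = torch.tensor([[24.5, 18.2, 20.1]])
candidate_labels = ["animals", "humans", "landscape"]

# Softmax over the label dimension turns logits into probabilities.
probs = logits_per_image.softmax(dim=-1)[0]

# Rank labels by score, highest first, like the pipeline output.
ranked = sorted(
    zip(candidate_labels, probs.tolist()), key=lambda pair: pair[1], reverse=True
)
for label, score in ranked:
    print(f"{label}: {score:.4f}")
```

The pipeline performs this same softmax-and-rank step internally; doing it by hand is only needed when you want direct access to the logits or embeddings.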
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters
The following hyperparameters were used during training: