fguzman82/CLIP-Finder2
CLIP-Finder enables semantic, fully offline search of your photo gallery using natural-language descriptions or the camera. Built on Apple's MobileCLIP-S0 model, it runs entirely on-device for fast, accurate media retrieval.
CLIP-Finder helps you quickly find specific images within your iPhone's photo gallery without an internet connection. You can search by describing what you're looking for with words, or by using your camera to capture an object or scene you want to match. This tool is for anyone who has a large collection of photos on their iPhone and needs an easy, private way to locate particular images.
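The search technique behind apps like this is straightforward: a CLIP-style model maps both photos and text queries into a shared embedding space, and photos are ranked by cosine similarity to the query vector. A minimal sketch of that ranking step, with hard-coded stand-in vectors in place of the embeddings a model like MobileCLIP-S0 would produce on-device (this is illustrative only, not CLIP-Finder's actual code):

```python
# Illustrative CLIP-style semantic photo search: rank photos by
# cosine similarity between the query embedding and each cached
# photo embedding. Vectors below are toy stand-ins for real
# model outputs.
import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    """Cosine similarity between two embedding vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def rank_photos(query_emb: np.ndarray,
                photo_embs: dict[str, np.ndarray]) -> list[str]:
    """Return photo IDs sorted from most to least similar to the query."""
    return sorted(photo_embs,
                  key=lambda pid: cosine_similarity(query_emb, photo_embs[pid]),
                  reverse=True)

# Stand-in per-photo embeddings (a real app precomputes and caches these).
photos = {
    "beach.jpg":  np.array([0.9, 0.1, 0.0]),
    "dog.jpg":    np.array([0.1, 0.9, 0.1]),
    "sunset.jpg": np.array([0.8, 0.2, 0.3]),
}
query = np.array([0.85, 0.15, 0.1])  # e.g. text "sand and sea"

print(rank_photos(query, photos))  # most similar photo first
```

Because only vector comparisons happen at query time, the search stays private and works with no internet connection.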
No commits in the last 6 months.
Use this if you frequently struggle to find old photos in your iPhone's gallery and prefer a natural, offline search method.
Not ideal if you need to search for images across cloud storage, other devices, or require advanced photo editing features.
Stars: 90
Forks: 11
Language: Swift
License: MIT
Category:
Last pushed: Jul 25, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/fguzman82/CLIP-Finder2"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
Higher-rated alternatives
unum-cloud/UForm
Pocket-Sized Multimodal AI for content understanding and generation across multilingual texts,...
rom1504/clip-retrieval
Easily compute clip embeddings and build a clip retrieval system with them
mazzzystar/Queryable
Run OpenAI's CLIP and Apple's MobileCLIP model on iOS to search photos.
s-emanuilov/litepali
LitePali is a minimal, efficient implementation of ColPali for image retrieval and indexing,...
slavabarkov/tidy
Offline semantic Text-to-Image and Image-to-Image search on Android powered by quantized...