mazzzystar/Queryable
Run OpenAI's CLIP and Apple's MobileCLIP models on iOS to search photos.
Queryable helps you quickly find specific photos on your iPhone by describing what you're looking for, rather than relying on predefined categories. You type a natural language query, like 'a brown dog sitting on a bench,' and the app searches your entire photo library to show you relevant images. This tool is for anyone who wants a more flexible and private way to search their personal photo collection on iOS.
2,924 stars. No commits in the last 6 months.
Use this if you want to find specific images in your iPhone's photo album using detailed text descriptions and prioritize your privacy by keeping all searches offline.
Not ideal if you need to search photos across multiple devices, collaborate on shared albums, or are looking for a cloud-based photo management solution.
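The search described above follows the standard CLIP retrieval pattern: photos are embedded once into vectors, the text query is embedded into the same space, and results are ranked by cosine similarity. Below is a minimal, hedged sketch of that ranking step in Python with toy 3-dimensional vectors standing in for real CLIP embeddings (typically 512-d); the function names `cosine` and `rank_photos` are illustrative, not Queryable's actual API.

```python
import math

def cosine(a, b):
    # Cosine similarity: dot product divided by the product of vector norms.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def rank_photos(query_emb, photo_embs, top_k=3):
    # Score every photo embedding against the query embedding,
    # then return photo indices ordered best-match first.
    scores = [(cosine(query_emb, emb), i) for i, emb in enumerate(photo_embs)]
    return [i for _, i in sorted(scores, reverse=True)[:top_k]]

# Toy embeddings: photo 0 and photo 2 point roughly the same way as the query.
photos = [
    [0.9, 0.1, 0.0],  # photo 0
    [0.1, 0.9, 0.1],  # photo 1
    [0.8, 0.2, 0.1],  # photo 2
]
query = [1.0, 0.0, 0.0]

print(rank_photos(query, photos))  # prints [0, 2, 1]
```

Because the photo embeddings are computed once and cached, each query costs only one text-encoder pass plus a similarity scan, which is what makes fully offline, on-device search practical.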
Stars: 2,924
Forks: 450
Language: Swift
License: MIT
Category:
Last pushed: Jan 04, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/mazzzystar/Queryable"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000/day.
Higher-rated alternatives
unum-cloud/UForm
Pocket-Sized Multimodal AI for content understanding and generation across multilingual texts,...
rom1504/clip-retrieval
Easily compute clip embeddings and build a clip retrieval system with them
s-emanuilov/litepali
LitePali is a minimal, efficient implementation of ColPali for image retrieval and indexing,...
slavabarkov/tidy
Offline semantic Text-to-Image and Image-to-Image search on Android powered by quantized...
cloudera/CML_AMP_Image_Analysis
Build a semantic search application with deep learning models.