Queryable and CLIP-Finder2
Both tools are independent iOS implementations of semantic image search built on CLIP-family models (OpenAI's CLIP and Apple's MobileCLIP). They are direct competitors rather than complementary components, offering similar functionality with different model choices and user interfaces.
About Queryable
mazzzystar/Queryable
Run OpenAI's CLIP and Apple's MobileCLIP model on iOS to search photos.
Queryable helps you quickly find specific photos on your iPhone by describing what you're looking for, rather than relying on predefined categories. You type a natural language query, like 'a brown dog sitting on a bench,' and the app searches your entire photo library to show you relevant images. This tool is for anyone who wants a more flexible and private way to search their personal photo collection on iOS.
About CLIP-Finder2
fguzman82/CLIP-Finder2
CLIP-Finder enables semantic offline search of gallery photos using natural language descriptions or the camera. Built on Apple's MobileCLIP-S0 architecture, it runs entirely on-device for fast and accurate media retrieval.
CLIP-Finder helps you quickly find specific images within your iPhone's photo gallery without an internet connection. You can search by describing what you're looking for with words, or by using your camera to capture an object or scene you want to match. This tool is for anyone who has a large collection of photos on their iPhone and needs an easy, private way to locate particular images.
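Both apps rely on the same underlying retrieval pattern: an image encoder precomputes an embedding vector for every photo in the library, a text encoder embeds the user's query, and photos are ranked by cosine similarity between the two. A minimal sketch of that ranking step, with toy 3-dimensional vectors standing in for real CLIP embeddings (all names and values here are illustrative, not taken from either codebase):

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search(query_embedding, photo_embeddings, top_k=3):
    """Rank photos by similarity to the query embedding, best first."""
    scored = [
        (photo_id, cosine_similarity(query_embedding, emb))
        for photo_id, emb in photo_embeddings.items()
    ]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:top_k]

# Toy embeddings; in the real apps these come from the image encoder
# run over the photo library (typically hundreds of dimensions).
photos = {
    "dog_on_bench.jpg": [0.9, 0.1, 0.0],
    "beach_sunset.jpg": [0.1, 0.8, 0.3],
    "city_street.jpg":  [0.2, 0.2, 0.9],
}
# Pretend text-encoder output for "a brown dog sitting on a bench".
query = [0.85, 0.15, 0.05]

for photo_id, score in search(query, photos):
    print(f"{photo_id}: {score:.3f}")
```

Because the image embeddings are computed once and cached, a new query only requires one pass through the text encoder plus a similarity scan, which is what makes fully offline, on-device search practical in both apps.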