rkouye/es-clip-image-search

Sample implementation of natural-language image search with OpenAI's CLIP and Elasticsearch or OpenSearch.

29 / 100 (Experimental)

This project lets anyone with a large collection of images search it using everyday language rather than exact keywords or manual tags. You provide a collection of images and descriptions, and it gives you a web interface where you can type a phrase like "a dog playing fetch in a park" to find relevant pictures, even if they aren't explicitly tagged with those words. It is aimed at image library managers, content creators, or anyone who needs to quickly retrieve specific visuals from a vast archive.
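The core idea behind this kind of search is to embed both images and free-text queries into the same vector space with CLIP, store the image vectors in Elasticsearch, and rank results by vector similarity. The sketch below is not taken from this repository; it illustrates the pattern using Elasticsearch 8.x's `dense_vector` mapping and top-level `knn` search clause, with a random unit vector standing in for a real CLIP text encoder (in practice you would call a library such as open_clip or sentence-transformers):

```python
import numpy as np

EMBED_DIM = 512  # CLIP ViT-B/32 produces 512-dimensional embeddings

def fake_clip_embed(text: str) -> list[float]:
    # Stand-in for a real CLIP text encoder: a deterministic-per-input,
    # L2-normalized random vector. A real setup would run the model here.
    rng = np.random.default_rng(abs(hash(text)) % (2**32))
    v = rng.standard_normal(EMBED_DIM)
    return (v / np.linalg.norm(v)).tolist()

# Index mapping: each image document stores its CLIP embedding in a
# dense_vector field so Elasticsearch can run approximate kNN on it.
index_mapping = {
    "mappings": {
        "properties": {
            "image_path": {"type": "keyword"},
            "embedding": {
                "type": "dense_vector",
                "dims": EMBED_DIM,
                "index": True,
                "similarity": "cosine",
            },
        }
    }
}

def build_knn_query(text: str, k: int = 10) -> dict:
    # Embed the free-text query, then ask Elasticsearch for the k
    # image vectors closest to it by cosine similarity.
    return {
        "knn": {
            "field": "embedding",
            "query_vector": fake_clip_embed(text),
            "k": k,
            "num_candidates": 10 * k,
        },
        "_source": ["image_path"],
    }

query = build_knn_query("a dog playing fetch in a park")
print(len(query["knn"]["query_vector"]))  # 512
```

The query body would be sent as `POST /<index>/_search`; OpenSearch exposes the equivalent capability through its `knn_vector` field type, so the overall flow is the same on either backend.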

No commits in the last 6 months.

Use this if you need a user-friendly way for non-technical users to find images in a large, uncategorized or loosely categorized image database using natural-language descriptions.

Not ideal if you only have a small number of images, or if your images are already perfectly tagged and categorized for traditional keyword search.

Tags: image-library-management content-discovery visual-asset-management digital-archives creative-asset-search
No License · Stale (6m) · No Package · No Dependents
Maintenance 0 / 25
Adoption 9 / 25
Maturity 8 / 25
Community 12 / 25


Stars: 73
Forks: 8
Language: Python
License: none
Last pushed: Sep 01, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/embeddings/rkouye/es-clip-image-search"

Open to everyone: 100 requests/day, no key needed. Get a free key for 1,000/day.