vector-db-benchmark and Vector-Arena
These are competing benchmarking frameworks for vector databases with overlapping scope. Qdrant's tool is the more established of the two (353 stars vs. 1), while Vector-Arena tries to differentiate itself through multiprocessing isolation and more granular latency metrics (diverse, sequential, filtered, and bulk search).
About vector-db-benchmark
qdrant/vector-db-benchmark
Framework for benchmarking vector search engines
This framework helps you compare the speed and efficiency of different vector search engines. It takes a vector search engine, a dataset (like embeddings for text or images), and a defined test scenario as input. It then measures how well and how quickly the engine performs, providing results to help you choose the best one for your specific needs. This is ideal for machine learning engineers, MLOps specialists, or anyone building or deploying applications that rely on fast and accurate vector search.
About Vector-Arena
M4iKZ/Vector-Arena
A comprehensive, multiprocessing-isolated benchmark for evaluating vector database performance and quality. Measures insertion speed, search latency (diverse, sequential, filtered, and bulk), recall accuracy, and memory usage across standard (ChromaDB, LanceDB, Qdrant, FAISS, USearch) and custom engine implementations.
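Both tools ultimately report variations of the same two core measurements: how quickly a query returns, and how much of the true top-k result set an engine actually recovers (recall accuracy). A minimal stdlib-only sketch of what those measurements look like; the helper names and the random dataset are illustrative, not taken from either project:

```python
import math
import random
import time

def brute_force_topk(query, vectors, k):
    # Ground truth: exact k nearest neighbours by exhaustive scan.
    return sorted(range(len(vectors)), key=lambda i: math.dist(query, vectors[i]))[:k]

def recall_at_k(returned_ids, exact_ids):
    # Fraction of the true top-k that the engine under test returned.
    return len(set(returned_ids) & set(exact_ids)) / len(exact_ids)

random.seed(0)
dim, n, k = 8, 500, 10
vectors = [[random.random() for _ in range(dim)] for _ in range(n)]
query = [random.random() for _ in range(dim)]

start = time.perf_counter()
exact = brute_force_topk(query, vectors, k)
latency_ms = (time.perf_counter() - start) * 1000  # per-query search latency

# An exact scan scores perfect recall against its own ground truth;
# an approximate engine (HNSW, IVF, ...) would typically score below 1.0.
print(f"recall@{k} = {recall_at_k(exact, exact)}, latency = {latency_ms:.2f} ms")
```

In a real run, `brute_force_topk` supplies the ground truth and the IDs returned by the engine under test are passed to `recall_at_k`; the benchmark frameworks repeat this over many queries and aggregate percentile latencies.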
Scores updated daily from GitHub, PyPI, and npm data.