D0miH/does-clip-know-my-face
Source Code for the JAIR Paper "Does CLIP Know my Face?" (Demo: https://huggingface.co/spaces/AIML-TUDA/does-clip-know-my-face)
This project helps privacy advocates and legal professionals assess whether an individual's face was used in the training data of large vision-language models such as CLIP, without that person's explicit consent. By feeding multiple images of a person into the model and observing how often it matches them to the correct name among various text labels, it provides evidence of whether that person's identity was part of the training set. The tool is aimed at those concerned about data privacy and the unauthorized use of personal images in AI model development.
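The core idea can be sketched as an identity inference test: run the same person through the model many times and check how often their real name wins. This is a minimal sketch, not the paper's implementation; `predict_name` is a hypothetical stand-in for CLIP's zero-shot image-to-text matching, and the names, file paths, and threshold are illustrative.

```python
from typing import Callable, Sequence


def infer_membership(
    images: Sequence[str],
    true_name: str,
    candidate_names: Sequence[str],
    predict_name: Callable[[str, Sequence[str]], str],
    threshold: float = 0.5,
) -> bool:
    """Flag a person as a likely training-set member if the model picks
    their real name for more than `threshold` of their images.

    `predict_name` stands in for CLIP zero-shot classification: given an
    image and a list of candidate names, it returns the best-matching name.
    The 0.5 threshold is an illustrative assumption, not a value from the paper.
    """
    hits = sum(predict_name(img, candidate_names) == true_name for img in images)
    return hits / len(images) > threshold


# Toy stand-in predictor: always returns the first candidate name.
demo_predict = lambda img, names: names[0]
print(infer_membership(["a.jpg", "b.jpg"], "Alice", ["Alice", "Bob"], demo_predict))
# -> True (the stub always predicts "Alice", so 2/2 hits exceed the threshold)
```

In practice, `predict_name` would embed the image and each `"a photo of <name>"` prompt with CLIP and return the name with the highest cosine similarity; aggregating over many images is what separates genuine recognition from chance.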
No commits in the last 6 months.
Use this if you need to determine with high accuracy whether a specific person's images were included in the training data of a large multimodal AI model, particularly for privacy audits or legal enforcement.
Not ideal if you are looking to identify individuals within a large, unlabeled dataset or perform general facial recognition tasks.
Stars: 16
Forks: —
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Jul 09, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/D0miH/does-clip-know-my-face"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
mlfoundations/open_clip
An open source implementation of CLIP.
noxdafox/clipspy
Python CFFI bindings for the 'C' Language Integrated Production System CLIPS
openai/CLIP
CLIP (Contrastive Language-Image Pretraining): predict the most relevant text snippet given an image
moein-shariatnia/OpenAI-CLIP
Simple implementation of OpenAI CLIP model in PyTorch.
BioMedIA-MBZUAI/FetalCLIP
Official repository of FetalCLIP: A Visual-Language Foundation Model for Fetal Ultrasound Image Analysis