billpsomas/efficient-probing
This repo contains the official implementation of the ICLR 2026 paper "Attention, Please! Revisiting Attentive Probing Through the Lens of Efficiency".
This project lets machine learning researchers efficiently evaluate frozen pre-trained image recognition models on new datasets. It takes a pre-trained model and image data as input and produces a performance score along with visual attention maps. It is aimed at researchers and ML engineers who need to quickly assess how well a pre-trained vision model captures specific visual concepts without full fine-tuning.
Use this if you need to quickly and efficiently evaluate the classification capability of a frozen pre-trained vision encoder on various image datasets, with interpretable attention maps offering insight into its decisions.
Not ideal if you want to fully fine-tune a model for maximum performance on a specific task, or if your primary goal is to train a model from scratch.
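To make the workflow above concrete, here is a minimal sketch of the general idea behind an attentive probe: a learned query attends over a frozen encoder's patch features, the softmax weights double as an attention map, and the pooled feature feeds a linear classifier. This is illustrative only, not the paper's exact probe design; all names and shapes are assumptions.

```python
import numpy as np

def attentive_probe(patch_feats, query, w_cls, b_cls):
    """Single-query attention pooling over frozen patch features (illustrative).

    patch_feats: (N, D) frozen encoder outputs for N patches (not trained)
    query:       (D,)   learned probe query; only query, w_cls, b_cls train
    w_cls:       (D, C) linear classifier weights
    b_cls:       (C,)   classifier bias
    Returns class logits (C,) and attention weights (N,) usable as a map.
    """
    scores = patch_feats @ query / np.sqrt(patch_feats.shape[1])
    scores -= scores.max()                         # numerical stability
    attn = np.exp(scores) / np.exp(scores).sum()   # softmax over patches
    pooled = attn @ patch_feats                    # attention-weighted feature, (D,)
    logits = pooled @ w_cls + b_cls
    return logits, attn

# Hypothetical shapes: a 14x14 ViT patch grid with 64-dim features, 10 classes.
rng = np.random.default_rng(0)
feats = rng.standard_normal((196, 64))
q = rng.standard_normal(64)
W = rng.standard_normal((64, 10))
b = np.zeros(10)
logits, attn = attentive_probe(feats, q, W, b)
print(logits.shape, attn.shape)
```

Because the encoder stays frozen, only the query and classifier parameters are optimized, which is what makes this kind of probing cheap compared with full fine-tuning; reshaping `attn` to the patch grid (here 14x14) gives the visual attention map.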
Stars: 29
Forks: —
Language: Python
License: Apache-2.0
Category:
Last pushed: Feb 23, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/billpsomas/efficient-probing"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
philipperemy/keras-attention
Keras Attention Layer (Luong and Bahdanau scores).
tatp22/linformer-pytorch
My take on a practical implementation of Linformer for Pytorch.
ematvey/hierarchical-attention-networks
Document classification with Hierarchical Attention Networks in TensorFlow. WARNING: project is...
datalogue/keras-attention
Visualizing RNNs using the attention mechanism
thushv89/attention_keras
Keras Layer implementation of Attention for Sequential models