shizhouxing/Robustness-Verification-for-Transformers
[ICLR 2020] Code for paper "Robustness Verification for Transformers"
This tool helps machine learning engineers assess the reliability of Transformer models used for text classification. It takes a trained Transformer model and a text classification dataset (such as Yelp or SST-2 reviews) as input, and outputs certified bounds on how robust the model is to small, imperceptible changes in the input text, helping engineers identify potential vulnerabilities.
No commits in the last 6 months.
Use this if you are a machine learning engineer working with Transformer models for natural language processing and need to formally verify their robustness against small input perturbations.
Not ideal if you need to verify the robustness of machine learning models other than Transformers or require a more general robustness verification framework.
Stars: 27
Forks: 3
Language: Python
License: BSD-2-Clause
Last pushed: Nov 26, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/shizhouxing/Robustness-Verification-for-Transformers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
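The same endpoint can also be queried from Python. A minimal sketch using only the standard library, assuming the endpoint shown above; the JSON fields in the response are not documented on this page, so the script simply returns the parsed payload as-is:

```python
import json
import urllib.request

# Base URL taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given GitHub repository."""
    return f"{API_BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch the quality record and parse it as JSON.
    The response schema is not documented here, so no fields are assumed."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)
```

For example, `fetch_quality("shizhouxing", "Robustness-Verification-for-Transformers")` hits the same URL as the curl command above; without an API key, calls count against the 100-requests/day limit.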
Higher-rated alternatives
SamsungSAILMontreal/nino
Code for "Accelerating Training with Neuron Interaction and Nowcasting Networks" [ICLR 2025]
graphdeeplearning/graphtransformer
Graph Transformer Architecture. Source code for "A Generalization of Transformer Networks to...
vijaydwivedi75/gnn-lspe
Source code for GNN-LSPE (Graph Neural Networks with Learnable Structural and Positional...
snap-stanford/relgt
Relational Graph Transformer
omron-sinicx/crystalframer
The official code repository for "Rethinking the role of frames for SE(3)-invariant crystal...