Kirill-Kravtsov/drophead-pytorch
An implementation of drophead regularization for pytorch transformers
This project helps improve the performance and generalization of Transformer-based natural language processing models. It applies DropHead, a regularization technique that randomly drops entire attention heads during training, to an existing Hugging Face Transformer model (such as BERT or RoBERTa), producing a more robust version that is less prone to overfitting, especially on smaller datasets; a sketch of the core mechanism follows below. It is aimed at machine learning engineers and researchers developing and fine-tuning NLP models.
No commits in the last 6 months.
Use this if you are a machine learning engineer working with Hugging Face Transformer models and need to improve their stability and reduce overfitting during training.
Not ideal if you are not using PyTorch or Hugging Face Transformers, or if you need scheduled DropHead support out of the box.
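DropHead's core mechanism is simple: during training, zero out the output of randomly chosen attention heads and rescale the surviving ones so the expected output is unchanged, analogous to standard dropout but at head granularity. The function below is a minimal, hypothetical sketch of that masking step; the function name, signature, and tensor layout are illustrative assumptions, not this repo's actual API.

import torch

def drophead(attn_heads: torch.Tensor, p: float, training: bool = True) -> torch.Tensor:
    """Randomly zero out whole attention heads and rescale the rest.

    attn_heads: per-head outputs of shape (batch, num_heads, seq_len, head_dim),
    i.e. the tensor produced just before heads are concatenated and projected.
    """
    if not training or p <= 0.0:
        return attn_heads
    batch, num_heads, _, _ = attn_heads.shape
    # One keep/drop draw per (sample, head); 1.0 means keep, 0.0 means drop.
    keep = torch.bernoulli(
        torch.full((batch, num_heads, 1, 1), 1.0 - p, device=attn_heads.device)
    )
    # Rescale by num_heads / heads_kept so the expected sum over heads matches
    # the no-dropout case; the clamp avoids dividing by zero in the rare case
    # that every head in a sample is dropped.
    scale = num_heads / keep.sum(dim=1, keepdim=True).clamp(min=1.0)
    return attn_heads * keep * scale

At evaluation time the function is an identity, mirroring standard dropout semantics. In practice you would apply something like this to each attention layer's per-head output, for instance via a PyTorch forward hook, so the model code itself stays untouched.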
Stars: 19
Forks: 6
Language: Python
License: MIT
Last pushed: Aug 24, 2021
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Kirill-Kravtsov/drophead-pytorch"
Open to everyone: 100 requests/day with no key needed. A free key raises the limit to 1,000 requests/day.
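The endpoint can also be queried from Python. Here is a minimal sketch using the requests library; the response schema is not documented on this page, so this just prints the raw JSON.

import requests

# Quality endpoint for this repo, as shown in the curl example above
url = ("https://pt-edge.onrender.com/api/v1/quality/"
       "transformers/Kirill-Kravtsov/drophead-pytorch")

resp = requests.get(url, timeout=10)
resp.raise_for_status()  # surface HTTP errors (e.g. rate limiting) early
print(resp.json())       # schema unknown here; inspect the raw payload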
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action