amazon-science/transformers-data-augmentation
Code associated with the "Data Augmentation using Pre-trained Transformer Models" paper
This project offers methods to expand small text datasets for natural language processing tasks. It takes an existing, limited text dataset and generates additional, varied examples, which can then be used to train more robust NLP models. Researchers and machine learning engineers working on text classification or natural language understanding with scarce data would find this useful.
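To make the workflow concrete, here is a minimal augmentation sketch. Note this is only a stand-in: the repo's actual approach, per the paper, conditions pre-trained transformer models on the data to generate new examples, whereas the random swap/delete below just illustrates the expand-a-small-dataset loop with no model dependency.

```python
import random

def augment(sentence: str, n_new: int = 2, p_delete: float = 0.1, seed: int = 0) -> list[str]:
    """Return n_new perturbed copies of `sentence` (random word deletion + swap).

    A transformer-based augmenter would replace this body with text
    generated by a pre-trained model; the function signature is hypothetical.
    """
    rng = random.Random(seed)
    words = sentence.split()
    out = []
    for _ in range(n_new):
        # Randomly drop words, but never emit an empty sentence.
        kept = [w for w in words if rng.random() > p_delete] or words[:]
        if len(kept) > 1:
            # Swap two random positions to vary word order.
            i, j = rng.sample(range(len(kept)), 2)
            kept[i], kept[j] = kept[j], kept[i]
        out.append(" ".join(kept))
    return out

# Expand a tiny labeled dataset, keeping each original label.
dataset = [("great movie", "pos"), ("terrible plot", "neg")]
expanded = dataset + [(s, y) for text, y in dataset for s in augment(text)]
```

The expanded list is then used as training data in place of the original; with a real pre-trained generator the new examples would be fluent paraphrases rather than simple perturbations.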
No commits in the last 6 months.
Use this if you need to improve the performance of a text-based machine learning model but are limited by a small amount of training data.
Not ideal if you already have large, high-quality text datasets or if your task doesn't involve text data.
Stars: 51
Forks: 7
Language: Python
License: —
Category:
Last pushed: Jun 12, 2023
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/amazon-science/transformers-data-augmentation"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
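The same endpoint can be called from code. The URL path below is taken from the curl example above; the JSON response shape is not documented here, so the helper simply decodes whatever the server returns.

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(ecosystem: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a repository."""
    return f"{BASE}/{ecosystem}/{owner}/{repo}"

def fetch_quality(ecosystem: str, owner: str, repo: str) -> dict:
    """GET the endpoint and decode the JSON body (raises on HTTP errors)."""
    with urllib.request.urlopen(quality_url(ecosystem, owner, repo)) as resp:
        return json.load(resp)

# Example: the repository on this page.
url = quality_url("transformers", "amazon-science", "transformers-data-augmentation")
```

Unauthenticated calls are limited to 100 requests/day as noted above; a key would presumably be passed per the service's docs, which are not shown here.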
Higher-rated alternatives
huggingface/transformers
🤗 Transformers: the model-definition framework for state-of-the-art machine learning models in...
kyegomez/LongNet
Implementation of plug in and play Attention from "LongNet: Scaling Transformers to 1,000,000,000 Tokens"
pbloem/former
Simple transformer implementation from scratch in pytorch. (archival, latest version on codeberg)
NVIDIA/FasterTransformer
Transformer related optimization, including BERT, GPT
kyegomez/SimplifiedTransformers
SimplifiedTransformer simplifies transformer block without affecting training. Skip connections,...