huggingface/naacl_transfer_learning_tutorial

Repository of code for the tutorial on Transfer Learning in NLP held at NAACL 2019 in Minneapolis, MN, USA

Quality score: 48 / 100 (Emerging)

This project provides code to demonstrate 'transfer learning' techniques in Natural Language Processing (NLP). It helps NLP practitioners train models more effectively on specific tasks by leveraging knowledge from large, general datasets. You input text data for pre-training and fine-tuning, and it outputs a refined NLP model tailored for classification or other downstream tasks. It's intended for machine learning engineers and researchers working with text.

722 stars. No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher who wants to understand and apply modern transfer learning methods to improve your NLP models, especially when you have limited task-specific data.

Not ideal if you are looking for a state-of-the-art, production-ready NLP solution without needing to dive into the underlying model architectures and training processes.

Natural-Language-Processing Machine-Learning-Engineering Text-Classification Model-Training Deep-Learning
Stale (6 months) · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 10 / 25
Maturity: 16 / 25
Community: 22 / 25


Stars: 722
Forks: 121
Language: Python
License: MIT
Last pushed: Oct 16, 2019
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/huggingface/naacl_transfer_learning_tutorial"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
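The same endpoint can be called from Python. A minimal sketch, assuming the endpoint returns a JSON body (the response format is not documented here, so the parsing step is illustrative):

```python
# Hedged sketch: fetch a repository's quality report from the API shown above.
# The URL pattern is taken from the curl example; the JSON response format
# is an assumption, not documented in this listing.
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-report URL for a repository, mirroring the curl example."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("nlp", "huggingface", "naacl_transfer_learning_tutorial")

# Uncomment to perform the request (requires network access; assumes JSON):
# with urllib.request.urlopen(url) as resp:
#     report = json.load(resp)
#     print(report)
```

The free tier (100 requests/day) needs no authentication, so no API key header is added here.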