KoreaMGLEE/Concept-based-curriculum-masking

Efficient Pre-training of Masked Language Model via Concept-based Curriculum Masking

Score: 19 / 100 (Experimental)

This project helps machine learning engineers pre-train masked language models more efficiently. It takes a raw text corpus and the ConceptNet knowledge graph as input and outputs a pre-trained language model ready for downstream natural language processing tasks. It's designed for ML engineers, NLP researchers, and data scientists working with large language models.
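The core idea, per the paper title, is to mask concept-bearing tokens on a curriculum that grows harder over time. The sketch below is a minimal illustration of that idea, not the repository's implementation: the concept sets, stage grouping, and masking probability are made-up placeholders (the actual project derives its concepts from ConceptNet).

```python
import random

# Illustrative curriculum: each stage is a set of concept tokens.
# Early stages hold basic concepts, later stages add harder ones.
# These sets are invented for the example, not taken from ConceptNet.
CURRICULUM = [
    {"dog", "cat", "run"},            # stage 0: basic concepts
    {"mammal", "canine", "sprint"},   # stage 1: harder concepts
]

MASK_TOKEN = "[MASK]"


def curriculum_mask(tokens, stage, mask_prob=0.15, seed=0):
    """Mask tokens belonging to any concept set up to `stage`."""
    rng = random.Random(seed)
    # Union of all concept sets unlocked so far in the curriculum.
    active = set().union(*CURRICULUM[: stage + 1])
    return [
        MASK_TOKEN if tok.lower() in active and rng.random() < mask_prob else tok
        for tok in tokens
    ]


# With mask_prob=1.0, every active-concept token is masked.
print(curriculum_mask("The dog can run fast".split(), stage=0, mask_prob=1.0))
# -> ['The', '[MASK]', 'can', '[MASK]', 'fast']
```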

No commits in the last 6 months.

Use this if you need to pre-train a transformer-based masked language model under a limited compute budget and want performance comparable to standard pre-training at lower cost.

Not ideal if you want an out-of-the-box solution for fine-tuning an existing language model, or if you are not pre-training a model from scratch.

Tags: natural-language-processing, language-model-training, computational-efficiency, machine-learning-engineering, text-analytics
No license · Stale (6 months) · No package · No dependents
Maintenance: 0 / 25
Adoption: 5 / 25
Maturity: 8 / 25
Community: 6 / 25

(The four subscores sum to the overall score of 19 / 100.)

Stars: 13
Forks: 1
Language: Python
License: None
Last pushed: Feb 05, 2023
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/KoreaMGLEE/Concept-based-curriculum-masking"

Open to everyone: 100 requests/day, no key needed. A free key raises the limit to 1,000 requests/day.
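For scripted access, the same endpoint can be queried from Python. This is a minimal sketch that assumes the endpoint returns JSON; it prints the raw response rather than assuming any particular field names, since the response schema is not documented here.

```python
import json
import urllib.request

# Endpoint taken verbatim from the curl example above.
URL = (
    "https://pt-edge.onrender.com/api/v1/quality/nlp/"
    "KoreaMGLEE/Concept-based-curriculum-masking"
)

# Fetch and decode the response; JSON is an assumption based on the
# API-style URL, not a documented guarantee.
with urllib.request.urlopen(URL, timeout=30) as resp:
    data = json.load(resp)

# Pretty-print whatever the service returns.
print(json.dumps(data, indent=2))
```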