explosion/spacy-transformers

🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy

Score: 53/100 (Established)

spacy-transformers helps AI/ML engineers and data scientists enhance their natural language processing (NLP) applications by integrating powerful pretrained language models such as BERT or GPT-2 directly into spaCy pipelines. The result: you feed text into a spaCy pipeline and get richer, more nuanced linguistic features out.


Use this if you are a data scientist or NLP engineer building robust text analysis systems with spaCy and want to leverage state-of-the-art transformer models for improved performance.

Not ideal if you only need predictions from pretrained text- or token-classification models and don't need transformer features exposed to downstream spaCy components.

natural-language-processing text-analysis machine-learning data-science AI-development
No package metadata · No dependents
Maintenance: 6/25
Adoption: 10/25
Maturity: 16/25
Community: 21/25


Stars: 1,402
Forks: 176
Language: Python
License: MIT
Last pushed: Nov 07, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/nlp/explosion/spacy-transformers"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.