explosion/spacy-transformers
🛸 Use pretrained transformers like BERT, XLNet and GPT-2 in spaCy
spacy-transformers lets AI/ML engineers and data scientists strengthen their natural language processing (NLP) applications by integrating powerful pretrained language models such as BERT, XLNet, and GPT-2 directly into their spaCy pipelines. Text fed through such a pipeline comes back with richer, more nuanced linguistic representations.
Use this if you are a data scientist or NLP engineer building robust text analysis systems with spaCy and want state-of-the-art transformer models for improved accuracy.
Not ideal if you only need predictions from an existing text- or token-classification model and don't need transformer features passed to downstream spaCy components.
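A minimal sketch of what the integration looks like in practice, assuming spacy-transformers is installed (`pip install spacy[transformers]`) and the transformer-backed `en_core_web_trf` model has been downloaded (`python -m spacy download en_core_web_trf`):

```python
# Sketch: named-entity extraction with a transformer-backed spaCy pipeline.
# Assumes spacy-transformers and the en_core_web_trf model are installed.
def entities(text: str) -> list[tuple[str, str]]:
    """Run the transformer pipeline and return (entity text, label) pairs."""
    import spacy  # imported lazily so the sketch can be read without spaCy installed

    nlp = spacy.load("en_core_web_trf")  # transformer-backed English pipeline
    doc = nlp(text)
    return [(ent.text, ent.label_) for ent in doc.ents]

# Example call (requires the model on disk):
# entities("Apple is looking at buying a U.K. startup.")
```

The rest of the spaCy API is unchanged; the transformer only swaps out the underlying feature extraction, so existing components like the entity recognizer consume its richer representations transparently.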
Stars: 1,402
Forks: 176
Language: Python
License: MIT
Category:
Last pushed: Nov 07, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/nlp/explosion/spacy-transformers"
Open to everyone — 100 requests/day with no key needed; a free key raises the limit to 1,000 requests/day.
Related tools
nltk/nltk
NLTK Source
explosion/spaCy
💫 Industrial-strength Natural Language Processing (NLP) in Python
undertheseanlp/underthesea
Underthesea - Vietnamese NLP Toolkit
stanfordnlp/stanza
Stanford NLP Python library for tokenization, sentence segmentation, NER, and parsing of many...
flairNLP/flair
A very simple framework for state-of-the-art Natural Language Processing (NLP)