ThilinaRajapakse/simpletransformers
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling, Language Generation, T5, Multi-Modal, and Conversational AI
This is a machine learning library for developers that simplifies using advanced Transformer models from HuggingFace for a variety of natural language processing tasks. It allows you to feed in text or image data and get out classified categories, generated text, or answers to questions, among other outputs. This tool is designed for developers, data scientists, and researchers who build applications involving text analysis, information retrieval, and conversational AI.
4,234 stars. Used by 4 other packages. No commits in the last 6 months. Available on PyPI.
Use this if you are a developer or data scientist who needs to quickly implement and fine-tune state-of-the-art Transformer models for NLP tasks with minimal code.
Not ideal if you are an end-user without programming experience, as this is a developer tool requiring Python knowledge.
Stars
4,234
Forks
721
Language
Python
License
Apache-2.0
Last pushed
Aug 25, 2025
Commits (30d)
0
Dependencies
16
Reverse dependents
4
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/ThilinaRajapakse/simpletransformers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
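The same endpoint can be called from Python. A minimal sketch, assuming only the URL shape shown in the curl example above; the `quality_url` and `fetch_quality` helper names are illustrative, not part of any official client:

```python
import json
import urllib.request

# Base URL taken from the curl example above.
API_BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"


def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a given owner/repo pair."""
    return f"{API_BASE}/{owner}/{repo}"


def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON quality record (requires network access)."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)


if __name__ == "__main__":
    # Prints the URL for this repository's record; call fetch_quality()
    # to retrieve the actual data (counts against the daily rate limit).
    print(quality_url("ThilinaRajapakse", "simpletransformers"))
```

The JSON schema of the response is not documented here, so the sketch returns the decoded payload as a plain `dict` rather than assuming any field names.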
Related repositories
jsksxs360/How-to-use-Transformers
A quick-start tutorial for the Transformers library
google/deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences...
Denis2054/Transformers-for-NLP-2nd-Edition
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning,...
abhimishra91/transformers-tutorials
GitHub repo with tutorials on fine-tuning transformers for different NLP tasks
Nicolepcx/transformers-the-definitive-guide
This is the official repository for the book Transformers - The Definitive Guide