louisbrulenaudet/tsdae
Transformer-based Sequential Denoising AutoEncoder (TSDAE) for unsupervised pre-training of Sentence Transformers.
This project helps train models to understand the meaning of sentences without needing a lot of manually labeled data. It takes raw text and produces specialized sentence embedding models that capture the semantic essence of your text. Data scientists and machine learning engineers who need to work with large volumes of text but lack extensive labeled datasets would find this useful.
Use this if you need to create high-quality sentence embeddings from unlabeled text data for downstream tasks like search, clustering, or classification.
Not ideal if you already have plenty of labeled data for your specific text understanding task or if you require an off-the-shelf solution without custom model training.
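TSDAE works by corrupting each input sentence (typically deleting around 60% of its tokens), encoding the noisy sentence into a single embedding, and training a decoder to reconstruct the original sentence from that embedding alone. A minimal sketch of the noising step in plain Python follows; the function name and deletion ratio are illustrative, and the actual training pipeline builds on the `sentence-transformers` library rather than this hand-rolled helper:

```python
import random

def delete_tokens(text: str, del_ratio: float = 0.6, seed: int = 42) -> str:
    """Corrupt a sentence by randomly deleting tokens, TSDAE-style.

    Each whitespace-separated token is dropped independently with
    probability `del_ratio`; at least one token is always kept so the
    encoder never sees an empty input.
    """
    rng = random.Random(seed)
    words = text.split()
    kept = [w for w in words if rng.random() > del_ratio]
    if not kept:
        kept = [rng.choice(words)]
    return " ".join(kept)

# The encoder sees the corrupted sentence; the decoder must
# reconstruct the clean original from the sentence embedding.
clean = "the quick brown fox jumps over the lazy dog"
noisy = delete_tokens(clean)
print(noisy)
```

Because the reconstruction objective forces all recoverable meaning through one fixed-size vector, the resulting embeddings capture sentence-level semantics without any labels.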
Stars
9
Forks
3
Language
Python
License
Apache-2.0
Category
Last pushed
Nov 17, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/louisbrulenaudet/tsdae"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
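The same endpoint can be queried from Python. A small sketch using only the standard library; the response's JSON shape is an assumption, so the helper below only builds the URL and returns the decoded payload as-is:

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(repo: str) -> str:
    """Build the quality-API URL for an 'owner/name' repo slug."""
    return f"{BASE}/{quote(repo, safe='/')}"

def fetch_quality(repo: str) -> dict:
    """Fetch quality data for a repo (no key needed up to 100 requests/day).

    The structure of the returned dict is not documented here, so callers
    should inspect it rather than assume specific fields.
    """
    with urlopen(quality_url(repo)) as resp:
        return json.load(resp)

print(quality_url("louisbrulenaudet/tsdae"))
```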
Higher-rated alternatives
ThilinaRajapakse/simpletransformers
Transformers for Information Retrieval, Text Classification, NER, QA, Language Modelling,...
jsksxs360/How-to-use-Transformers
A quick-start tutorial for the Transformers library
google/deepconsensus
DeepConsensus uses gap-aware sequence transformers to correct errors in Pacific Biosciences...
Denis2054/Transformers-for-NLP-2nd-Edition
Transformer models from BERT to GPT-4, environments from Hugging Face to OpenAI. Fine-tuning,...
abhimishra91/transformers-tutorials
GitHub repo with tutorials on fine-tuning transformers for different NLP tasks