0x7o/RETRO-transformer
Easy-to-use Retrieval-Enhanced Transformer implementation
This project helps machine learning engineers and researchers build language models that can retrieve information from large external text databases. You supply a large text corpus, and the tool helps you train a 'retrieval-enhanced' transformer that consults that corpus at inference time, producing more contextually aware and factually accurate text. It is aimed at natural language processing applications where factual accuracy and access to external knowledge are critical.
No commits in the last 6 months. Available on PyPI.
Use this if you need to develop a transformer-based language model that can tap into vast amounts of external knowledge to improve its understanding and generation of text.
Not ideal if you are looking for an off-the-shelf pre-trained model or a simple fine-tuning solution without needing to build a custom retrieval mechanism.
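The core idea behind a retrieval-enhanced transformer is that, at inference time, the model looks up the chunks of the external corpus most similar to its current input and conditions its generation on them. A minimal sketch of that retrieval step, using a toy bag-of-words embedding and cosine similarity (the real RETRO design uses frozen BERT embeddings and approximate nearest-neighbour search; all function names here are hypothetical, not this library's API):

```python
import math
import re
from collections import Counter

def embed(text):
    # Toy bag-of-words embedding; RETRO-style systems use dense
    # neural embeddings instead.
    return Counter(re.findall(r"[a-z]+", text.lower()))

def cosine(a, b):
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query, corpus_chunks, k=2):
    # Return the k corpus chunks most similar to the query; a
    # retrieval-enhanced decoder would cross-attend to these.
    q = embed(query)
    ranked = sorted(corpus_chunks,
                    key=lambda c: cosine(q, embed(c)),
                    reverse=True)
    return ranked[:k]

corpus = [
    "The Eiffel Tower is in Paris.",
    "Transformers use self-attention.",
    "Paris is the capital of France.",
]
neighbours = retrieve("Where is the Eiffel Tower?", corpus, k=2)
# → ["The Eiffel Tower is in Paris.", "Paris is the capital of France."]
```

In a full system the retrieved neighbours are fed to the transformer through cross-attention rather than simply prepended, which is what distinguishes this architecture from plain prompt-stuffing.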
Stars: 10
Forks: 4
Language: Python
License: Apache-2.0
Category: Transformers
Last pushed: Sep 30, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/0x7o/RETRO-transformer"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
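The same endpoint can be called from Python instead of curl. A small sketch using only the standard library; the URL path is taken from the curl example above, while the structure of the JSON response is an assumption and should be checked against the actual API:

```python
import json
import urllib.request

BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category, owner, repo):
    # Build the API URL for a repository, mirroring the curl example.
    return f"{BASE}/{category}/{owner}/{repo}"

def fetch_quality(category, owner, repo):
    # Fetch and decode the JSON payload (requires network access;
    # the response schema is not documented here).
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)

url = quality_url("transformers", "0x7o", "RETRO-transformer")
# → "https://pt-edge.onrender.com/api/v1/quality/transformers/0x7o/RETRO-transformer"
```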
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action