0x7o/RETRO-transformer

Easy-to-use Retrieval-Enhanced Transformer implementation

Score: 45 / 100 · Emerging

This project helps machine learning engineers and researchers build language models that retrieve information from large external text databases. You provide a large text corpus, and the tool helps you train a 'retrieval-enhanced' transformer (RETRO) model, yielding a language model that is more contextually aware and factually accurate. It is aimed at natural language processing applications where factual accuracy and access to external knowledge are critical.

No commits in the last 6 months. Available on PyPI.

Use this if you need to develop a transformer-based language model that can tap into vast amounts of external knowledge to improve its understanding and generation of text.

Not ideal if you are looking for an off-the-shelf pre-trained model or a simple fine-tuning solution without needing to build a custom retrieval mechanism.

natural-language-processing large-language-models information-retrieval machine-learning-engineering
Stale (6m) · No Dependents
Maintenance 0 / 25
Adoption 5 / 25
Maturity 25 / 25
Community 15 / 25


Stars: 10
Forks: 4
Language: Python
License: Apache-2.0
Last pushed: Sep 30, 2022
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/0x7o/RETRO-transformer"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.