Debrup-61/RaDeR
Official code repository for "RaDeR: Reasoning-aware Dense Retrieval Models", accepted at the EMNLP 2025 Main Conference
This tool helps researchers and AI developers working with large language models (LLMs) to improve their models' ability to solve complex mathematical problems. It takes mathematical problem data and LLM reasoning steps, then generates specialized training data. The output is a 'reasoning-aware' retrieval model that finds relevant information to guide LLMs across diverse reasoning tasks.
No commits in the last 6 months.
Use this if you need to train LLMs to perform better on mathematical reasoning and similar complex problem-solving tasks by providing more relevant context.
Not ideal if you are looking for a plug-and-play solution for general information retrieval, as this is specifically designed for enhancing LLM mathematical reasoning.
Stars: 16
Forks: 1
Language: Python
License: MIT
Category:
Last pushed: Jun 23, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/rag/Debrup-61/RaDeR"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000 requests/day.
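The same endpoint can be called programmatically. A minimal Python sketch, using only the standard library; note that the JSON response schema is an assumption here, as only the endpoint URL is documented above:

```python
import json
import urllib.request

# Base endpoint taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality/rag"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a GitHub owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the quality record.

    The response is assumed to be a JSON object; the exact fields
    (stars, forks, license, ...) are not specified here.
    """
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(quality_url("Debrup-61", "RaDeR"))
```

Requests beyond the free daily quota would presumably need the API key passed along, but the key mechanism (header vs. query parameter) is not documented above.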
Higher-rated alternatives
denser-org/denser-retriever
An enterprise-grade AI retriever designed to streamline AI integration into your applications,...
rayliuca/T-Ragx
Enhancing Translation with RAG-Powered Large Language Models
neuml/rag
🚀 Retrieval Augmented Generation (RAG) with txtai. Combine search and LLMs to find insights with...
NovaSearch-Team/RAG-Retrieval
Unify Efficient Fine-tuning of RAG Retrieval, including Embedding, ColBERT, ReRanker.
RulinShao/retrieval-scaling
Official repository for "Scaling Retrieval-Based Language Models with a Trillion-Token Datastore".