pphuc25/distil-cd
Distillation Contrastive Decoding: Improving LLMs Reasoning with Contrastive Decoding and Distillation
This tool improves the reasoning abilities of large language models (LLMs) during text generation, especially on complex questions. Given an existing LLM and a prompt, it produces more accurate and logically consistent answers by down-weighting tokens that reflect flawed reasoning at decoding time. Machine learning engineers and researchers working with LLMs would find this useful.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher looking to enhance the reasoning and accuracy of your large language models, particularly for tasks requiring arithmetic or commonsense reasoning.
Not ideal if you are a non-technical user looking for a ready-to-use application, as this is a developer-focused library requiring coding knowledge.
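The repository builds on contrastive decoding, which scores each candidate token by the gap between a strong "expert" distribution and a weaker "amateur" distribution. The sketch below is a minimal, generic illustration of that scoring rule on toy logits, not the repository's actual API; in the distillation variant described here, the amateur would typically come from the same model weakened (e.g. via a flawed chain-of-thought prompt or dropout) rather than from a separate smaller model.

```python
import numpy as np

def softmax(logits):
    e = np.exp(logits - logits.max())
    return e / e.sum()

def contrastive_decode_step(expert_logits, amateur_logits, alpha=0.1):
    """One greedy step of contrastive decoding: score tokens by
    log p_expert - log p_amateur, restricted to the expert's plausible
    set {t : p_expert(t) >= alpha * max p_expert}."""
    p_exp = softmax(expert_logits)
    p_ama = softmax(amateur_logits)
    plausible = p_exp >= alpha * p_exp.max()
    scores = np.where(plausible, np.log(p_exp) - np.log(p_ama), -np.inf)
    return int(np.argmax(scores))

# Toy example: token 2 has the highest expert probability, but the
# amateur also likes it; token 1 is plausible for the expert yet
# unlikely under the amateur, so the contrastive score selects it.
expert = np.array([0.0, 2.0, 2.5, -3.0])
amateur = np.array([0.0, -1.0, 2.4, -3.0])
print(contrastive_decode_step(expert, amateur))  # -> 1
```

Greedy argmax over `expert` alone would pick token 2; the contrastive score instead rewards tokens where the expert is confident and the amateur is not, which is the intuition behind suppressing shallow or erroneous reasoning paths.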
Stars: 35
Forks: 1
Language: Python
License: —
Category: —
Last pushed: Feb 27, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/pphuc25/distil-cd"
Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000/day.
Higher-rated alternatives
microsoft/AdaMix
This is the implementation of the paper AdaMix: Mixture-of-Adaptations for Parameter-efficient...
taissirboukrouba/Structured-Information-Retrieval-with-LLMs
Academic Sequence Labelling Between DistilBERT & Encoder-only Transformer
mominalix/LLM-Model-Distillation-for-Text-Classification-Models-GUI
GUI application that performs knowledge distillation from OpenAI models to smaller Hugging Face...