pphuc25/distil-cd

Distillation Contrastive Decoding: Improving LLMs Reasoning with Contrastive Decoding and Distillation

Score: 18 / 100 (Experimental)

This tool improves the reasoning abilities of large language models (LLMs) during text generation, especially on complex questions. Given an existing LLM and a prompt, it produces more accurate, logically consistent answers by suppressing likely reasoning errors at decoding time; a sketch of the underlying decoding idea follows below. It is aimed at machine learning engineers and researchers working with LLMs.
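The repo builds on contrastive decoding, where a strong "expert" model's next-token scores are corrected against a weaker "amateur" model's. The following is a minimal, generic sketch of that idea, assuming a standard Hugging Face transformers setup; the model names and the alpha/beta hyperparameters are illustrative assumptions, not the repo's defaults, and the repo's distillation step for deriving the amateur is not shown.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumption: any strong/weak pair sharing a tokenizer works as a stand-in.
tok = AutoTokenizer.from_pretrained("gpt2-large")
expert = AutoModelForCausalLM.from_pretrained("gpt2-large").eval()
amateur = AutoModelForCausalLM.from_pretrained("gpt2").eval()

@torch.no_grad()
def contrastive_generate(prompt: str, max_new_tokens: int = 64,
                         alpha: float = 0.1, beta: float = 0.5) -> str:
    ids = tok(prompt, return_tensors="pt").input_ids
    for _ in range(max_new_tokens):
        exp_logits = expert(ids).logits[:, -1, :]
        ama_logits = amateur(ids).logits[:, -1, :]
        # Plausibility constraint: keep only tokens the expert itself
        # assigns at least alpha times its maximum probability.
        probs = exp_logits.softmax(-1)
        mask = probs < alpha * probs.max(dim=-1, keepdim=True).values
        # Contrastive score: expert log-prob minus scaled amateur log-prob.
        score = exp_logits.log_softmax(-1) - beta * ama_logits.log_softmax(-1)
        score = score.masked_fill(mask, float("-inf"))
        next_id = score.argmax(dim=-1, keepdim=True)
        ids = torch.cat([ids, next_id], dim=-1)
        if next_id.item() == tok.eos_token_id:
            break
    return tok.decode(ids[0], skip_special_tokens=True)

print(contrastive_generate("Q: I have 3 apples and buy 2 more. How many apples do I have? A:"))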

No commits in the last 6 months.

Use this if you are a machine learning engineer or researcher looking to enhance the reasoning and accuracy of your large language models, particularly for tasks requiring arithmetic or commonsense reasoning.

Not ideal if you are a non-technical user looking for a ready-to-use application, as this is a developer-focused library requiring coding knowledge.

large-language-models natural-language-processing AI-reasoning machine-learning-engineering
No License · Stale 6m · No Package · No Dependents
Maintenance: 0 / 25
Adoption: 7 / 25
Maturity: 8 / 25
Community: 3 / 25


Stars: 35
Forks: 1
Language: Python
License: None
Last pushed: Feb 27, 2024
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/transformers/pphuc25/distil-cd"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
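The same data can be fetched in Python using only the standard library. The URL comes from the curl example above; the shape of the JSON payload is an assumption, so inspect it before relying on specific keys.

import json
import urllib.request

url = "https://pt-edge.onrender.com/api/v1/quality/transformers/pphuc25/distil-cd"
with urllib.request.urlopen(url) as resp:
    data = json.load(resp)  # parse the JSON body of the response

# Print the full payload first to see which fields (score, stars, etc.) exist.
print(json.dumps(data, indent=2))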