Bradley-Butcher/Conformers
Unofficial implementation of Conformal Language Modeling by Quach et al.
This project helps machine learning engineers and researchers explore Conformal Language Modeling. It takes a pre-trained language model and a set of calibration prompts as input, then outputs calibrated parameters that let the model generate text with more reliable confidence estimates. The primary users are those experimenting with ways to make large language models more trustworthy and interpretable.
No commits in the last 6 months.
Use this if you are a machine learning engineer or researcher looking to apply and experiment with conformal prediction techniques to improve the reliability of your language models' outputs.
Not ideal if you are an end-user seeking a ready-to-use, robust solution for general text generation or if you are not comfortable working with experimental machine learning implementations.
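For orientation, the core of conformal prediction is a calibration step: nonconformity scores from held-out prompts determine a threshold that bounds the error rate of future predictions. Below is a minimal sketch of split-conformal calibration in plain Python; the function name and scores are illustrative and do not come from this repository.

```python
import math

def conformal_quantile(scores, alpha):
    """Split-conformal threshold: the ceil((n+1)*(1-alpha))-th smallest
    calibration nonconformity score. Hypothetical helper, not the repo's API."""
    n = len(scores)
    k = math.ceil((n + 1) * (1 - alpha))  # rank of the conformal quantile
    if k > n:
        return float("inf")  # too few calibration points for this alpha
    return sorted(scores)[k - 1]

# Example: nonconformity scores from 9 calibration prompts, target error rate alpha = 0.2
scores = [0.12, 0.35, 0.08, 0.41, 0.22, 0.30, 0.18, 0.27, 0.15]
threshold = conformal_quantile(scores, alpha=0.2)
# Generations whose nonconformity score is <= threshold are retained,
# giving roughly 1 - alpha coverage on exchangeable data.
```

Quach et al. extend this idea from single predictions to sets of sampled generations, but the calibration-then-threshold structure is the same.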
Stars
29
Forks
3
Language
Python
License
—
Category
—
Last pushed
Jul 15, 2023
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/Bradley-Butcher/Conformers"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
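The same request can be made from Python with the standard library. The endpoint is taken from the curl line above; the response schema isn't documented here, so the fetch is left commented rather than assuming any field names.

```python
import json
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-API URL for a given repository."""
    return f"{BASE}/{owner}/{repo}"

url = quality_url("Bradley-Butcher", "Conformers")
# data = json.load(urlopen(url))  # uncomment to fetch; parse fields per the API docs
```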
Higher-rated alternatives
lucidrains/x-transformers
A concise but complete full-attention transformer with a set of promising experimental features...
kanishkamisra/minicons
Utility for behavioral and representational analyses of Language Models
lucidrains/simple-hierarchical-transformer
Experiments around a simple idea for inducing multiple hierarchical predictive models within a GPT
lucidrains/dreamer4
Implementation of Danijar's latest iteration for his Dreamer line of work
Nicolepcx/Transformers-in-Action
This is the corresponding code for the book Transformers in Action