rick12000/confopt
A Library for Conformal Hyperparameter Tuning
This library helps machine learning practitioners tune their models' hyperparameters for the best possible performance. You supply your model code and the ranges of settings to search, and it returns the best configuration found along with the performance it achieved. It's aimed at data scientists and ML engineers who want to improve accuracy or other metrics, and, as the name suggests, it uses conformal prediction to guide the search.
104 stars. Available on PyPI.
Use this if you need to find the best configuration for your machine learning model to maximize its performance, especially for tasks like classification or regression.
Not ideal if your primary need is parallelized tuning, multi-fidelity optimization, or multi-objective optimization, since confopt offers no Optuna integration for those workflows.
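confopt's own API isn't reproduced here. As a generic illustration of the workflow the description implies (sample candidate settings, score them, keep the best), here is a plain random-search sketch in standard-library Python; confopt itself guides this loop with conformal prediction rather than sampling uniformly, and the objective and search space below are toy assumptions.

```python
import random


def random_search(objective, space, n_trials=50, seed=0):
    """Generic random search over a discrete hyperparameter space.

    Samples `n_trials` configurations from `space`, scores each with
    `objective` (higher is better), and returns the best one found.
    Illustrative only: confopt replaces uniform sampling with a
    conformal-prediction-guided search.
    """
    rng = random.Random(seed)
    best_cfg, best_score = None, float("-inf")
    for _ in range(n_trials):
        # Draw one value per hyperparameter to form a candidate config.
        cfg = {name: rng.choice(values) for name, values in space.items()}
        score = objective(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score


# Toy objective (hypothetical): best at max_depth=8, learning_rate=0.1.
def toy_objective(cfg):
    return -abs(cfg["max_depth"] - 8) - abs(cfg["learning_rate"] - 0.1) * 10


space = {"max_depth": [2, 4, 8, 16], "learning_rate": [0.01, 0.1, 0.3]}
best, score = random_search(toy_objective, space)
```

In practice the objective would train and validate your model on each candidate configuration; the loop structure is the same.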
Stars: 104
Forks: 5
Language: Python
License: MIT
Category: ml-frameworks
Last pushed: Nov 24, 2025
Commits (30d): 0
Dependencies: 7
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/rick12000/confopt"
Open to everyone: 100 requests/day with no key; a free key raises the limit to 1,000/day.
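The same endpoint can be called from Python instead of curl. A minimal standard-library sketch, assuming the endpoint returns a JSON object (the exact response schema isn't documented here):

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, repo: str) -> str:
    # Build the endpoint URL for a repo within a category.
    return f"{API_BASE}/{category}/{repo}"


def fetch_quality(category: str, repo: str) -> dict:
    # Fetch and decode the JSON payload; the field names in the
    # response are not specified on this page, so inspect the dict.
    with urllib.request.urlopen(quality_url(category, repo)) as resp:
        return json.load(resp)


url = quality_url("ml-frameworks", "rick12000/confopt")
```

Calling `fetch_quality("ml-frameworks", "rick12000/confopt")` performs the same request as the curl command above.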
Higher-rated alternatives
zillow/quantile-forest: Quantile Regression Forests compatible with scikit-learn.
valeman/awesome-conformal-prediction: A professionally curated list of awesome Conformal Prediction videos, tutorials, books, papers,...
yromano/cqr: Conformalized Quantile Regression.
henrikbostrom/crepes: Python package for conformal prediction.
xRiskLab/pearsonify: Lightweight Python package for generating classification intervals in binary classification...