rick12000/confopt

A Library for Conformal Hyperparameter Tuning

Score: 47 / 100 (Emerging)

This tool helps machine learning practitioners fine-tune the settings (hyperparameters) of their machine learning models to get the best possible performance. You provide your model code and a range of settings to test, and it outputs the optimal settings and the best performance achieved. It's designed for data scientists and ML engineers looking to improve model accuracy or other metrics.
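The workflow described above can be sketched in a few lines. This is a hypothetical illustration of the tune-and-return-best pattern, not confopt's actual API: the `validation_score` objective, the `tune` helper, and the toy search space are all stand-ins invented for this example.

```python
import itertools

def validation_score(config):
    # Toy stand-in for training your model with these settings and
    # scoring it on held-out data (higher is better). Peaks at
    # lr=0.1, depth=4 by construction.
    return -(config["lr"] - 0.1) ** 2 - (config["depth"] - 4) ** 2

# The "range of settings to test" you would provide.
search_space = {"lr": [0.01, 0.1, 1.0], "depth": [2, 4, 8]}

def tune(objective, space):
    # Minimal exhaustive search: score every combination and keep
    # the best configuration and its score.
    keys = list(space)
    best_cfg, best_score = None, float("-inf")
    for values in itertools.product(*(space[k] for k in keys)):
        cfg = dict(zip(keys, values))
        score = objective(cfg)
        if score > best_score:
            best_cfg, best_score = cfg, score
    return best_cfg, best_score

best_cfg, best_score = tune(validation_score, search_space)
print(best_cfg, best_score)
```

A real tuner such as confopt replaces the exhaustive loop with a smarter search strategy, but the contract is the same: objective plus search space in, best settings and best score out.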

104 stars. Available on PyPI.

Use this if you need to find the best configuration for your machine learning model to maximize its performance, especially for tasks like classification or regression.

Not ideal if your primary need is parallelized tuning, multi-fidelity optimization, or multi-objective optimization, since there is no Optuna integration to provide those features.

machine-learning-optimization model-tuning data-science predictive-modeling ML-engineering
Maintenance 6 / 25
Adoption 9 / 25
Maturity 25 / 25
Community 7 / 25


Stars: 104
Forks: 5
Language: Python
License: MIT
Last pushed: Nov 24, 2025
Commits (30d): 0
Dependencies: 7

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/rick12000/confopt"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
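The same request can be made from Python. A minimal sketch that builds the endpoint URL shown in the curl example above; the `quality_url` helper and the idea that other categories follow the same path pattern are assumptions, only the `ml-frameworks` path is shown on this page.

```python
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Assemble the quality-score endpoint for a repository,
    # following the path shape of the curl example above.
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "rick12000", "confopt")
print(url)
# Pass `url` to curl, urllib.request.urlopen, or requests.get
# to fetch the JSON payload (subject to the rate limits above).
```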