EmanuelSommer/MILE

Code for the ICLR 2025 paper: "Microcanonical Langevin Ensembles: Advancing the Sampling of Bayesian Neural Networks"

Score: 35 / 100 (Emerging)

This project helps machine learning practitioners sample Bayesian Neural Networks (BNNs) more efficiently. It takes your BNN configuration and data, then generates high-quality posterior samples and model diagnostics faster than traditional methods. The output includes trained models, posterior samples, performance metrics, and visualizations, enabling quicker development and evaluation of robust models.

No commits in the last 6 months.

Use this if you need to sample Bayesian Neural Networks and want to significantly reduce the time it takes to get high-quality results compared to established samplers such as NUTS (the No-U-Turn Sampler).

Not ideal if you are working with non-Bayesian neural networks or if your primary goal is not efficient sampling for uncertainty quantification.

Tags: Bayesian-modeling, machine-learning-engineering, predictive-analytics, uncertainty-quantification, model-training
Status: Stale (6 months) · No Package · No Dependents
Maintenance 0 / 25
Adoption 4 / 25
Maturity 16 / 25
Community 15 / 25
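
The four component scores above appear to add up to the headline score. A minimal sketch of that arithmetic, assuming the overall score is the plain sum of the four 25-point components (the scoring formula itself is not documented on this page):

```python
# Component scores as listed on this page. Assumption: the headline
# 35 / 100 is simply the sum of the four 25-point components.
components = {
    "Maintenance": 0,
    "Adoption": 4,
    "Maturity": 16,
    "Community": 15,
}

total = sum(components.values())
print(total)  # → 35, matching the headline score
```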


Stars: 8
Forks: 4
Language: Python
License: MIT
Last pushed: Feb 27, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/EmanuelSommer/MILE"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.
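
The same endpoint can be called from Python with only the standard library. This is a sketch, not an official client: the URL path is taken from the curl command above, the `quality_url` and `fetch_quality` helper names are made up for illustration, and the shape of the JSON response is not documented here, so the fetch helper just returns the parsed body.

```python
import json
import urllib.request

# Base path copied from the curl example on this page.
API_ROOT = "https://pt-edge.onrender.com/api/v1/quality"


def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-score endpoint URL for one repository."""
    return f"{API_ROOT}/{category}/{owner}/{repo}"


def fetch_quality(category: str, owner: str, repo: str) -> dict:
    """Fetch the quality report as JSON (100 requests/day without a key)."""
    with urllib.request.urlopen(quality_url(category, owner, repo)) as resp:
        return json.load(resp)


# Building the URL requires no network access:
print(quality_url("ml-frameworks", "EmanuelSommer", "MILE"))
# → https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/EmanuelSommer/MILE
```

Calling `fetch_quality("ml-frameworks", "EmanuelSommer", "MILE")` would perform the same request as the curl command, subject to the daily rate limit.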