JuliaTrustworthyAI/LaplaceRedux.jl

Effortless Bayesian Deep Learning through Laplace Approximation for Flux.jl neural networks.

Score: 44 / 100 (Emerging)

This tool helps machine learning practitioners evaluate how confident their deep learning models are in their predictions. You provide your pre-trained Flux.jl neural network and training data, and it outputs predictions with uncertainty estimates (like confidence intervals for regression or probability contours for classification). This is useful for anyone building and deploying deep learning models who needs to understand the reliability of those models' outputs.

Use this if you are a machine learning engineer or data scientist working with Julia and Flux.jl, and you need to quantify the uncertainty of your neural network's predictions.

Not ideal if you are not using Julia or Flux.jl for your deep learning models, or if you need highly complex or computationally intensive Bayesian deep learning methods beyond Laplace Approximation.
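The typical workflow can be sketched in a few lines. This is a minimal illustration assuming the package's documented `Laplace` / `fit!` / `predict` API for a Flux.jl model; exact names and keyword arguments may differ between versions, so check the package documentation before relying on it.

```julia
using Flux, LaplaceRedux

# Toy regression data: y = sin(x) + noise
X = reshape(collect(Float32, range(-3, 3; length=50)), 1, :)
y = vec(sin.(X)) .+ 0.1f0 .* randn(Float32, 50)
data = zip(eachcol(X), y)

# A small Flux network (stands in for your pre-trained model)
nn = Chain(Dense(1, 16, tanh), Dense(16, 1))

# Wrap the trained network in a Laplace approximation and fit it
la = Laplace(nn; likelihood=:regression)
fit!(la, data)
optimize_prior!(la)  # tune prior precision via the marginal likelihood

# Predictions now carry uncertainty: predictive mean and variance
fmu, fvar = predict(la, X)
```

For classification, the analogous call would use `likelihood=:classification`, and the predictions become calibrated class probabilities rather than a mean/variance pair.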

machine-learning-engineering deep-learning-evaluation predictive-modeling model-uncertainty julia-ecosystem
No registered package · No dependents
Maintenance: 10 / 25
Adoption: 8 / 25
Maturity: 16 / 25
Community: 10 / 25


Stars: 49
Forks: 5
Language: Julia
License: MIT
Last pushed: Feb 05, 2026
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/JuliaTrustworthyAI/LaplaceRedux.jl"

Open to everyone: 100 requests/day with no key needed. Get a free key for 1,000 requests/day.