softmin/ReHLine
Regularized Composite ReLU-ReHU Loss Minimization with Linear Computation and Linear Convergence
ReHLine helps machine learning practitioners build models such as SVMs, quantile regression, and Huber regression quickly and precisely, even on large datasets or under fairness constraints. Given raw data and a desired model type, it produces an optimized model ready for prediction. The tool targets data scientists, ML engineers, and researchers who need to train robust, high-performing models for classification, regression, and risk assessment.
Use this if you need to train models for classification, regression, or risk assessment with piecewise linear-quadratic (PLQ) loss functions and linear constraints, and you need speed and scalability, especially on large datasets or under fairness requirements.
Not ideal if your problem is non-convex or falls outside the PLQ-loss family, or if you are not doing predictive modeling on structured numerical data.
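The PLQ losses named above (hinge for SVMs, check loss for quantile regression, Huber loss) are all built from two primitives: ReLU and ReHU (rectified Huber). A minimal NumPy sketch of those primitives, following the definitions in the ReHLine paper; the function names and the `tau` parameter here are illustrative, not the package's actual API:

```python
import numpy as np

def relu(z):
    """ReLU(z) = max(z, 0): the piecewise-linear building block."""
    return np.maximum(np.asarray(z, dtype=float), 0.0)

def rehu(z, tau=1.0):
    """ReHU_tau(z): the piecewise-quadratic building block.
    0 for z <= 0; z^2/2 for 0 < z <= tau; tau*(z - tau/2) for z > tau."""
    z = np.asarray(z, dtype=float)
    return np.where(z <= 0.0, 0.0,
           np.where(z <= tau, 0.5 * z**2, tau * (z - tau / 2.0)))

# The Huber loss decomposes as ReHU_tau(z) + ReHU_tau(-z):
z = np.linspace(-3.0, 3.0, 13)
tau = 1.0
huber = np.where(np.abs(z) <= tau,
                 0.5 * z**2,
                 tau * (np.abs(z) - tau / 2.0))
assert np.allclose(rehu(z, tau) + rehu(-z, tau), huber)

# The SVM hinge loss is a single ReLU term: hinge = relu(1 - y * score)
```

The decomposition check above is the point of the "composite ReLU-ReHU" framing: any convex PLQ loss can be written as a sum of ReLU and ReHU terms, which is what enables the solver's linear per-iteration cost.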
Stars: 47
Forks: 1
Language: —
License: MIT
Category: —
Last pushed: Dec 16, 2025
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/softmin/ReHLine"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
Compare
Higher-rated alternatives
jolars/sortedl1
Python package for Sorted L-One Penalized Estimation (SLOPE)
gugarosa/opytimizer
🐦 Opytimizer is a Python library consisting of meta-heuristic optimization algorithms.
SENATOROVAI/gradient-descent-sgd-solver-course
Stochastic Gradient Descent (SGD) is an optimization algorithm that updates model parameters...
hiroyuki-kasai/SGDLibrary
MATLAB/Octave library for stochastic optimization algorithms: Version 1.0.20
softmin/ReHLine-python
Regularized Composite ReLU-ReHU Loss Minimization with Linear Computation and Linear Convergence