PKBoost-AI-Labs/PkBoost
PKBoost: an adaptive GBDT for concept drift, built from scratch in Rust. PKBoost handles shifting data distributions in fraud detection at a 0.2% fraud rate, degrading less than 2% under drift; by comparison, XGBoost drops 31.8% and LightGBM drops 42.5%.
PKBoost helps professionals who build predictive models, especially for fraud or anomaly detection, by producing accurate models that automatically adapt as data patterns change over time. You provide historical data, and it outputs a robust predictive model that identifies rare events with high precision even as trends shift. This makes it well suited to risk analysts, data scientists, and operations engineers who need reliable detection systems in dynamic environments.
Use this if you need to build a predictive model that must remain accurate in the face of evolving data, such as detecting new fraud patterns, identifying anomalies in real-time systems, or diagnosing diseases where indicators might subtly change.
Not ideal if your data distribution is completely static and unchanging, or if you primarily work with categorical features that would require extensive pre-processing.
Stars
66
Forks
5
Language
Rust
License
Apache-2.0
Category
Last pushed
Jan 27, 2026
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/PKBoost-AI-Labs/PkBoost"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
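The endpoint above follows an apparent `/{category}/{owner}/{repo}` pattern. A minimal sketch of building such URLs programmatically, assuming the pattern generalizes to other repositories (only the `ml-frameworks` path shown above is confirmed by this page):

```python
# Hypothetical helper: assemble a quality-API URL from its parts.
# Only the ml-frameworks/PKBoost-AI-Labs/PkBoost path is known from this page;
# other category/owner/repo combinations are an assumption.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality endpoint URL for a given category and repository."""
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("ml-frameworks", "PKBoost-AI-Labs", "PkBoost")
print(url)
```

Pass the resulting URL to `curl` (or any HTTP client) as shown above; unauthenticated calls are limited to 100 requests/day.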
Higher-rated alternatives
dmlc/xgboost
Scalable, Portable and Distributed Gradient Boosting (GBDT, GBRT or GBM) Library, for Python,...
catboost/catboost
A fast, scalable, high performance Gradient Boosting on Decision Trees library, used for...
stanfordmlgroup/ngboost
Natural Gradient Boosting for Probabilistic Prediction
lightgbm-org/LightGBM
A fast, distributed, high performance gradient boosting (GBT, GBDT, GBRT, GBM or MART) framework...
fabsig/GPBoost
Tree-Boosting, Gaussian Processes, and Mixed-Effects Models