LingFengGold/TimeDistill

[KDD 2026] Official implementation of "TimeDistill: Efficient Long-Term Time Series Forecasting with MLPs via Cross-Architecture Distillation"

Score: 42 / 100 (Emerging)

This tool helps data scientists and machine learning engineers make accurate long-term predictions from time series data using less computational power. It takes historical time series datasets and a 'teacher' model (like a Transformer or CNN), then outputs a more efficient, lightweight prediction model (an MLP) that performs as well as, or better than, the complex teacher. This is ideal for those managing predictive analytics in resource-constrained environments.

Use this if you need to forecast long-term trends in time series data with high accuracy but are limited by computational resources or model size.

Not ideal if you are working with short-term forecasts or if you have ample computational resources and prefer to directly deploy large, complex models.
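The core idea described above, training a small MLP student to imitate a larger teacher forecaster, can be sketched in a few lines. This is a minimal illustration of teacher-to-MLP distillation, not the repository's actual implementation: the toy teacher, the window length `L_in`, horizon `H`, and all variable names are assumptions for the example.

```python
# Hypothetical sketch of cross-architecture distillation: a one-hidden-layer
# MLP "student" is trained to match the forecasts of a frozen "teacher".
# Shapes and the stand-in teacher are illustrative, not from TimeDistill.
import numpy as np

rng = np.random.default_rng(0)

# Toy setup: forecast the next H steps from the last L_in observations.
L_in, H, hidden = 8, 4, 16
X = rng.normal(size=(256, L_in))          # historical windows

# Stand-in "teacher": any fixed function mapping history -> forecast.
W_teacher = rng.normal(size=(L_in, H)) * 0.5
teacher_pred = np.tanh(X @ W_teacher)     # soft targets for distillation

# MLP student, trained by plain gradient descent to minimize MSE
# against the teacher's outputs instead of the raw ground truth.
W1 = rng.normal(size=(L_in, hidden)) * 0.1
W2 = rng.normal(size=(hidden, H)) * 0.1
mse_before = float(np.mean((np.tanh(X @ W1) @ W2 - teacher_pred) ** 2))

lr = 0.05
for _ in range(500):
    h = np.tanh(X @ W1)                   # hidden activations
    err = h @ W2 - teacher_pred           # distillation residual
    gW2 = h.T @ err / len(X)
    gh = err @ W2.T * (1 - h ** 2)        # backprop through tanh
    gW1 = X.T @ gh / len(X)
    W1 -= lr * gW1
    W2 -= lr * gW2

mse = float(np.mean((np.tanh(X @ W1) @ W2 - teacher_pred) ** 2))
```

At inference time only the two small matrices `W1` and `W2` are kept, which is where the efficiency gain over deploying the full teacher comes from.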

Tags: time-series-forecasting, predictive-analytics, resource-optimization, machine-learning-engineering, data-science
No package. No dependents.
Maintenance 6 / 25
Adoption 6 / 25
Maturity 16 / 25
Community 14 / 25


Stars: 15
Forks: 3
Language: Python
License: Apache-2.0
Last pushed: Nov 27, 2025
Commits (30d): 0

Get this data via API

curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/LingFengGold/TimeDistill"

Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.