Qingrenn/TSFM-ScalingLaws
[ICLR 2025] Official implementation of "Towards Neural Scaling Laws for Time Series Foundation Models"
This project helps machine learning researchers study how the performance of time series forecasting models scales with training data volume and model size. You supply diverse time series datasets and model configurations; the project then trains the models, collects performance metrics, and visualizes the relationships between model size, data size, and performance.
Use this if you are a machine learning researcher exploring the fundamental scaling laws of time series foundation models and want to conduct systematic experiments and analysis.
Not ideal if you are looking for a pre-packaged tool to apply existing time series models to a business problem without deep research into model architecture or scaling.
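To make the "relationships between model size, data size, and performance" concrete: scaling laws are typically modeled as a power law, loss ≈ a · N^(−b), which is linear in log-log space and can be fit by ordinary least squares. The sketch below is illustrative only, using synthetic data and hypothetical function names, not the repo's actual API:

```python
import numpy as np

def fit_power_law(sizes, losses):
    """Fit loss = a * size**(-b) via least squares in log-log space.

    A power law becomes a straight line after taking logs:
    log(loss) = log(a) - b * log(size).
    Returns the recovered (a, b).
    """
    slope, intercept = np.polyfit(np.log(sizes), np.log(losses), 1)
    return float(np.exp(intercept)), float(-slope)

# Synthetic data following loss = 5.0 * N**(-0.3) exactly
# (stand-in for metrics collected from real training runs).
N = np.array([1e6, 1e7, 1e8, 1e9])
loss = 5.0 * N ** (-0.3)

a, b = fit_power_law(N, loss)
print(f"a={a:.3f}, b={b:.3f}")  # recovers a≈5.0, b≈0.3
```

In practice the fitted exponent b summarizes how quickly error falls as you scale up data or parameters, which is the quantity such scaling-law studies compare across model families.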
Stars
22
Forks
2
Language
Jupyter Notebook
License
Apache-2.0
Category
ML Frameworks
Last pushed
Oct 15, 2025
Commits (30d)
0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/Qingrenn/TSFM-ScalingLaws"
Open to everyone: 100 requests/day with no key needed; a free key raises the limit to 1,000/day.
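The same endpoint can be called from Python instead of curl. This is a minimal sketch using only the standard library; the helper names are hypothetical, and the assumption that the response body is JSON comes from the endpoint's style, not from documented behavior:

```python
import json
import urllib.request

# Base path taken from the curl example; the trailing segments are
# assumed to be <owner>/<repo> of the GitHub repository.
BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def quality_url(owner: str, repo: str) -> str:
    """Build the quality-endpoint URL for a given GitHub repo."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Perform a live GET request (requires network access);
    assumes the endpoint returns JSON."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

if __name__ == "__main__":
    print(quality_url("Qingrenn", "TSFM-ScalingLaws"))
    # Live call, subject to the 100 requests/day anonymous limit:
    # data = fetch_quality("Qingrenn", "TSFM-ScalingLaws")
```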
Higher-rated alternatives
ntucllab/libact
Pool-based active learning in Python
scikit-activeml/scikit-activeml
scikit-activeml: A Comprehensive and User-friendly Active Learning Library
python-adaptive/adaptive
:chart_with_upwards_trend: Adaptive: parallel active learning of mathematical functions
NUAA-AL/ALiPy
ALiPy: Active Learning in Python is an active learning python toolbox, which allows users to...
ai4co/awesome-fm4co
Recent research papers about Foundation Models for Combinatorial Optimization