raminmh/liquid_time_constant_networks
Code Repository for Liquid Time-Constant Networks (LTCs)
This repository lets researchers and machine learning practitioners benchmark continuous-time neural network models on time-series prediction and classification tasks. Given raw time-series datasets (e.g., sensor readings or gesture data), it reports performance metrics such as accuracy and loss for different continuous-time architectures. It is aimed primarily at researchers and data scientists working on advanced time-series analysis.
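To make "continuous-time" concrete: the LTC paper describes a neuron whose state follows dx/dt = -(1/τ + f)·x + f·A, where f is an input-dependent nonlinearity, and solves it with a fused semi-implicit Euler step. The sketch below is a minimal NumPy illustration of that update, not the repository's actual API; the parameter names (W, U, b) are illustrative assumptions.

```python
import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def ltc_fused_step(x, I, tau, A, W, U, b, dt=0.1):
    """One fused semi-implicit Euler step of the LTC ODE
        dx/dt = -(1/tau + f) * x + f * A,  with  f = sigmoid(W x + U I + b).
    Solving the implicit update for x(t+dt) gives the closed form below.
    W, U, b are illustrative parameter names, not the repo's own API.
    """
    f = sigmoid(W @ x + U @ I + b)                 # input-dependent gate
    return (x + dt * f * A) / (1.0 + dt * (1.0 / tau + f))
```

Because f appears in the denominator, the effective time constant varies with the input, which is the "liquid" property the model is named for; the fused step also keeps the update numerically stable without an iterative ODE solver.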
1,812 stars. No commits in the last 6 months.
Use this if you are a researcher or advanced practitioner experimenting with continuous-time models for sequence data and need to rigorously evaluate their performance against benchmarks.
Not ideal if you are looking for a plug-and-play solution for general time-series forecasting or classification without deep engagement with neural network architectures.
Stars: 1,812
Forks: 327
Language: Python
License: Apache-2.0
Category:
Last pushed: Jun 03, 2024
Commits (last 30 days): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/raminmh/liquid_time_constant_networks"
Open to everyone: 100 requests/day with no key required. A free key raises the limit to 1,000 requests/day.
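The same endpoint can be called from Python with the standard library. This is a minimal sketch: the URL layout mirrors the curl example above, but the JSON response fields are not documented here, so inspect the payload before relying on specific keys.

```python
import json
import urllib.request

API_BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    """Build the quality-endpoint URL; path layout taken from the curl example."""
    return f"{API_BASE}/{category}/{owner}/{repo}"

def fetch_quality(category: str, owner: str, repo: str, timeout: float = 10.0) -> dict:
    """Fetch and decode the JSON payload (requires network access)."""
    with urllib.request.urlopen(quality_url(category, owner, repo),
                                timeout=timeout) as resp:
        return json.loads(resp.read().decode("utf-8"))
```

For example, `quality_url("ml-frameworks", "raminmh", "liquid_time_constant_networks")` reproduces the URL used in the curl command.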
Related frameworks
kaanaksit/odak
Scientific computing library for optics, computer graphics and visual perception.
NVIDIA/torch-harmonics
Differentiable signal processing on the sphere for PyTorch
PreFab-Photonics/PreFab
Artificial nanofabrication of integrated photonic circuits using deep learning
MatthewFilipovich/torchoptics
Differentiable wave optics simulation library built on PyTorch
artificial-scientist-lab/XLuminA
A highly efficient, auto-differentiating discovery framework for super-resolution microscopy.