mims-harvard/TFC-pretraining
Self-supervised contrastive learning for time series via time-frequency consistency
This project helps scientists and engineers generalize knowledge from one set of time series data to another, even when data types or conditions vary. You provide diverse time series measurements, like sensor readings or medical signals, and it produces a 'pre-trained' model. This model can then be adapted more easily to new, smaller datasets for tasks such as identifying a seizure, recognizing a gesture, or detecting machinery faults.
519 stars. No commits in the last 6 months.
Use this if you need to train models for time series classification tasks, especially when you have limited labeled data for your specific problem but access to a lot of unlabeled time series data.
Not ideal if your problem does not involve time series data or if you have a very large, well-labeled dataset for your specific task and do not need to transfer knowledge from other time series.
Stars: 519
Forks: 92
Language: Python
License: MIT
Category:
Last pushed: May 07, 2024
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks/mims-harvard/TFC-pretraining"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
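The endpoint above takes an owner/repo pair in the path. A minimal Python sketch of the same call, assuming the endpoint returns JSON (the response schema is not documented here, so the parsed result is treated as an opaque dict):

```python
import json
import urllib.request

# Documented base endpoint for repo quality data.
BASE = "https://pt-edge.onrender.com/api/v1/quality/ml-frameworks"

def quality_url(owner: str, repo: str) -> str:
    """Build the API URL for a given owner/repo pair."""
    return f"{BASE}/{owner}/{repo}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch quality data; anonymous access is limited to 100 requests/day.
    Assumes a JSON response body."""
    with urllib.request.urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

# Example: the URL for this repo matches the curl command above.
url = quality_url("mims-harvard", "TFC-pretraining")
```

For the 1,000 requests/day tier, a key would presumably be passed with the request (header or query parameter), but the key mechanism is not specified here.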
Higher-rated alternatives
AdaptiveMotorControlLab/CEBRA
Learnable latent embeddings for joint behavioral and neural analysis - Official implementation of CEBRA
theolepage/sslsv
Toolkit for training and evaluating Self-Supervised Learning (SSL) frameworks for Speaker...
PaddlePaddle/PASSL
PASSL includes image self-supervised learning algorithms such as SimCLR, MoCo v1/v2, BYOL, CLIP, PixPro, SimSiam, SwAV, BEiT, and MAE, as well as Vision...
YGZWQZD/LAMDA-SSL
30 Semi-Supervised Learning Algorithms
ModSSC/ModSSC
ModSSC: A Modular Framework for Semi-Supervised Classification