yang-ai-lab/OSF-Open-Sleep-FM
OSF: On Pre-training and Scaling of Sleep Foundation Models
This project offers a family of open sleep foundation models (OSF) designed to overcome the variability in sleep study data across recording devices and patient groups. It takes raw polysomnography (PSG) signals as input and outputs generalizable representations (embeddings) usable for a range of sleep and disease prediction tasks. Researchers, clinicians, and data scientists working with sleep health data can use it to build more accurate and robust models.
Use this if you need to analyze PSG data and build predictive models for sleep stages or sleep-related conditions, especially when dealing with data from diverse sources and device types.
Not ideal if you are looking for a ready-to-use diagnostic tool rather than a foundation model to build on for your specific downstream tasks.
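As a rough illustration of the intended workflow (the repository's actual API is not documented on this page, so `encode` below is a hypothetical stand-in for the pretrained OSF encoder), a frozen-embedding pipeline for a downstream prediction task might look like:

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical stand-in for the OSF encoder: in practice this would be the
# pretrained foundation model mapping a raw PSG epoch to a fixed-length
# embedding. Here we fake it with per-channel mean/std pooling.
def encode(psg_epoch: np.ndarray) -> np.ndarray:
    return np.concatenate([psg_epoch.mean(axis=1), psg_epoch.std(axis=1)])

# Toy data: 100 "epochs" of 4 channels x 3000 samples (30 s at 100 Hz),
# with a synthetic binary label derived from the channel-0 mean so the
# task is learnable from the pooled embedding.
X_raw = rng.normal(size=(100, 4, 3000))
y = (X_raw[:, 0].mean(axis=1) > 0).astype(int)

# Embed every epoch with the frozen encoder, then fit a simple
# nearest-centroid classifier on top of the embeddings.
E = np.stack([encode(x) for x in X_raw])
centroids = np.stack([E[y == c].mean(axis=0) for c in (0, 1)])

def predict(emb: np.ndarray) -> int:
    return int(np.argmin(np.linalg.norm(centroids - emb, axis=1)))

preds = np.array([predict(e) for e in E])
accuracy = (preds == y).mean()
```

The point of the sketch is the separation of concerns: the (here faked) encoder is never retrained, and only a lightweight head is fit per downstream task.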
Stars: 10
Forks: 1
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Mar 12, 2026
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/yang-ai-lab/OSF-Open-Sleep-FM"
Open to everyone — 100 requests/day, no key needed. Get a free key for 1,000/day.
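The endpoint can also be queried from Python. A minimal sketch using only the standard library (the response schema is not documented on this page, so the payload is treated as an opaque JSON object):

```python
import json
from urllib.parse import quote
from urllib.request import urlopen

BASE = "https://pt-edge.onrender.com/api/v1/quality/transformers"

def quality_url(owner: str, repo: str) -> str:
    """Build the per-repository endpoint URL shown in the curl example."""
    return f"{BASE}/{quote(owner)}/{quote(repo)}"

def fetch_quality(owner: str, repo: str) -> dict:
    """Fetch and decode the JSON payload (schema assumed, not documented)."""
    with urlopen(quality_url(owner, repo)) as resp:
        return json.load(resp)

url = quality_url("yang-ai-lab", "OSF-Open-Sleep-FM")
```

Calling `fetch_quality("yang-ai-lab", "OSF-Open-Sleep-FM")` performs the same request as the curl command above, subject to the 100 requests/day keyless limit.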
Higher-rated alternatives
scaleapi/llm-engine
Scale LLM Engine public repository
AGI-Arena/MARS
The official implementation of MARS: Unleashing the Power of Variance Reduction for Training Large Models
modelscope/easydistill
a toolkit on knowledge distillation for large language models
AGI-Edgerunners/LLM-Adapters
Code for our EMNLP 2023 Paper: "LLM-Adapters: An Adapter Family for Parameter-Efficient...
Wang-ML-Lab/bayesian-peft
Bayesian Low-Rank Adaptation of LLMs: BLoB [NeurIPS 2024] and TFB [NeurIPS 2025]