tobifinn/ensemble_transformer
Official PyTorch implementation of "Self-Attentive Ensemble Transformer: Representing Ensemble Interactions in Neural Networks for Earth System Models".
This tool helps climate scientists and meteorologists process ensemble data from Earth system models, such as weather forecast outputs. It takes raw or pre-processed ensemble data (e.g., ERA5 or IFS fields) and applies a self-attentive transformer to produce improved, non-parametric forecasts or analyses. Earth system modelers can use it to enhance the accuracy and interpretability of their model predictions.
No commits in the last 6 months.
Use this if you need to analyze or post-process ensemble data from climate or weather models using advanced neural network techniques, moving beyond traditional parametric assumptions.
Not ideal if you prefer simpler, more interpretable statistical post-processing methods or if your primary focus is on models that don't involve ensemble data.
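The repository's core idea is self-attention computed across ensemble members rather than across time or space. The sketch below is a minimal, hypothetical NumPy illustration of that idea only; the function name, shapes, and single-head design are assumptions for exposition and are not the repository's actual API.

```python
import numpy as np

def ensemble_self_attention(x):
    """Toy single-head self-attention over the ensemble-member axis.

    x: array of shape (members, features). Hypothetical sketch of the
    paper's core idea (members attending to each other), not the
    repository's implementation.
    """
    scores = x @ x.T / np.sqrt(x.shape[-1])         # (members, members)
    scores -= scores.max(axis=-1, keepdims=True)    # numerical stability
    weights = np.exp(scores)
    weights /= weights.sum(axis=-1, keepdims=True)  # softmax over members
    return weights @ x                              # attention-weighted mixture

members = np.random.randn(5, 16)   # 5 ensemble members, 16 features each
out = ensemble_self_attention(members)
print(out.shape)  # (5, 16): one updated state per member
```

Because the weights mix information between members, each output member is informed by the whole ensemble, which is what lets the network represent ensemble interactions.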
Stars: 14
Forks: 1
Language: Jupyter Notebook
License: MIT
Category:
Last pushed: Sep 28, 2022
Commits (30d): 0
Get this data via API
curl "https://pt-edge.onrender.com/api/v1/quality/transformers/tobifinn/ensemble_transformer"
Open to everyone: 100 requests/day with no key required; a free key raises the limit to 1,000/day.
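For scripted access, the same endpoint can be built programmatically. The snippet below only constructs and prints the request URL; the category/owner/repo path scheme is an assumption inferred from the curl example above, not a documented API contract.

```python
# Base of the quality API, taken from the curl example above.
BASE = "https://pt-edge.onrender.com/api/v1/quality"

def quality_url(category: str, owner: str, repo: str) -> str:
    # Path layout (category/owner/repo) is inferred from the example,
    # not from official API documentation.
    return f"{BASE}/{category}/{owner}/{repo}"

url = quality_url("transformers", "tobifinn", "ensemble_transformer")
print(url)
# https://pt-edge.onrender.com/api/v1/quality/transformers/tobifinn/ensemble_transformer
```

The URL can then be passed to any HTTP client (curl, `urllib.request`, `requests`) subject to the rate limits noted above.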
Higher-rated alternatives
microsoft/LoRA
Code for loralib, an implementation of "LoRA: Low-Rank Adaptation of Large Language Models"
jadore801120/attention-is-all-you-need-pytorch
A PyTorch implementation of the Transformer model in "Attention is All You Need".
bhavnicksm/vanilla-transformer-jax
JAX/Flax implementation of 'Attention Is All You Need' by Vaswani et al....
kyegomez/SparseAttention
Pytorch Implementation of the sparse attention from the paper: "Generating Long Sequences with...
AbdelStark/attnres
Rust implementation of Attention Residuals from MoonshotAI/Kimi