Prithvi WxC
Foundation model for weather and climate that employs an encoder-decoder-based architecture, incorporating concepts from several recent transformer models to capture both regional and global dependencies in the input data.
https://github.com/nasa-impact/prithvi-wxc
Category: Atmosphere
Sub Category: Meteorological Observation and Forecast
Last synced: about 5 hours ago
Repository metadata
Implementation of the Prithvi WxC Foundation Model and Downstream Tasks
- Host: GitHub
- URL: https://github.com/nasa-impact/prithvi-wxc
- Owner: NASA-IMPACT
- License: MIT
- Created: 2024-08-30T16:23:06.000Z (8 months ago)
- Default Branch: main
- Last Pushed: 2025-03-26T02:18:57.000Z (about 1 month ago)
- Last Synced: 2025-04-22T00:04:58.137Z (5 days ago)
- Language: Python
- Size: 57.8 MB
- Stars: 140
- Watchers: 6
- Forks: 26
- Open Issues: 7
- Releases: 2
Metadata Files:
- Readme: README.md
- License: LICENSE
README.md
Prithvi WxC: Foundation model for weather and climate
This repository contains the code of the Prithvi WxC foundation model as well as basic zero-shot examples for testing and illustration. For fine-tuning applications, please refer to the task-specific repositories listed below.
Updates
March 25, 2025
The previous version of this repository contained a number of bugs that led to incorrect model outputs and worse performance than reported in our paper. These issues have now been addressed. In particular, the validation code below lets you verify that your platform and version of the code produce results comparable to ours. (See step 3 under Getting started.)
Architecture overview: A scalable and flexible vision transformer
Prithvi WxC is at its core a scalable 2D vision transformer. The architecture is designed to allow for memory-efficient masked pretraining and draws inspiration from the Hiera, MaxViT, and Swin transformers. Inputs, structured into windows, take the shape (batch, windows, tokens, features). The model alternates between local attention (within a window) and global attention (across windows), implemented by transposing dimensions between transformer layers: attention acts on the third dimension, while the second is folded into the batch. When the data is dense -- i.e. in the absence of masking -- Swin-like shifts can be added to the local attention layers. See the figure in the repository for illustration.
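The alternating scheme can be sketched in a few lines of PyTorch. The block below is a simplified, hypothetical illustration of the idea (a single attention sublayer per axis, no masking, no MLP or normalization), not the repository's actual implementation; all class and parameter names are assumptions, and only the (batch, windows, tokens, features) shape convention is taken from the description above.

import torch
import torch.nn as nn

class AxisAttention(nn.Module):
    # Multi-head self-attention over the third axis of a 4D tensor;
    # the second axis is folded into the batch, as described above.
    def __init__(self, dim, heads=4):
        super().__init__()
        self.attn = nn.MultiheadAttention(dim, heads, batch_first=True)

    def forward(self, x):
        b, a1, a2, f = x.shape
        y = x.reshape(b * a1, a2, f)    # merge the second axis into the batch
        y, _ = self.attn(y, y, y)       # attention runs over the third axis
        return y.reshape(b, a1, a2, f)

class LocalGlobalBlock(nn.Module):
    # One local (within-window) step followed by one global (across-window)
    # step, switched by transposing the windows and tokens axes.
    def __init__(self, dim):
        super().__init__()
        self.local_attn = AxisAttention(dim)
        self.global_attn = AxisAttention(dim)

    def forward(self, x):                # x: (batch, windows, tokens, features)
        x = x + self.local_attn(x)       # attend over tokens within each window
        x = x.transpose(1, 2)            # now (batch, tokens, windows, features)
        x = x + self.global_attn(x)      # attend over windows at each token position
        return x.transpose(1, 2)         # back to (batch, windows, tokens, features)

# Toy usage: 2 samples, 8 windows of 16 tokens, 64 features per token.
x = torch.randn(2, 8, 16, 64)
print(LocalGlobalBlock(64)(x).shape)     # torch.Size([2, 8, 16, 64])

Folding one axis into the batch before calling attention is what lets the same layer act as either local or global attention, depending on whether the windows and tokens axes have been swapped.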
Fine-tuning applications
We have fine-tuned the model to a number of downstream tasks. See the paper as well as the respective repository for details.
Application | Dataset | Repository
---|---|---
Downscaling | MERRA-2 | https://github.com/IBM/granite-wxc
Downscaling | EURO-CORDEX | https://github.com/IBM/granite-wxc
Gravity wave parametrization | ERA5 | https://github.com/NASA-IMPACT/gravity-wave-finetuning
Beyond these there are zero-shot applications in masked reconstruction and forecasting.
Getting started
1. Create a virtual environment.
2. Clone this repository and install Prithvi WxC as a module:
   git clone https://github.com/NASA-IMPACT/Prithvi-WxC
   cd Prithvi-WxC
   pip install '.[examples]'
3. Validate that the model behaves as expected. For that, run:
   python -m validation.validate_prithvi_wxc -c validation/config.yaml
4. Run one of the notebooks in the examples directory. These notebooks will download model weights as well as sample data for basic illustration from Hugging Face.
Pretrained models
Prithvi WxC is a very flexible model. It was pretrained on a pretext task blending masked reconstruction and forecasting, so it can be used for both 0-hour-ahead and forecasting applications. Moreover, the masking pattern makes it suitable for both global and regional applications. There are currently two pretrained base models as well as several fine-tuning applications.
Model | Details | Weights
---|---|---
prithvi.wxc.2300m.v1 | Pretrained 2.3B-parameter model. Flexible input and lead time. For general and 0-hour-ahead applications. | https://huggingface.co/Prithvi-WxC/prithvi.wxc.2300m.v1
prithvi.wxc.rollout.2300m.v1 | Pretrained 2.3B-parameter model. Input and lead time fixed to 6 h. For forecasting applications. | https://huggingface.co/Prithvi-WxC/prithvi.wxc.rollout.2300m.v1
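The example notebooks fetch weights and sample data automatically. If you prefer to download a checkpoint yourself, one option (an assumption, not the repository's documented workflow) is the huggingface_hub client, using the repository IDs from the table above; huggingface_hub is not listed among the core dependencies, so it may need to be installed separately.

from huggingface_hub import snapshot_download

# Download the pretrained checkpoint repository to a local directory.
# "Prithvi-WxC/prithvi.wxc.2300m.v1" is the Hugging Face repo ID from the
# table above; local_dir is an arbitrary path of your choosing.
local_path = snapshot_download(
    repo_id="Prithvi-WxC/prithvi.wxc.2300m.v1",
    local_dir="weights/prithvi.wxc.2300m.v1",
)
print(local_path)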
Data
Prithvi WxC was pretrained on data from the MERRA-2 reanalysis. In particular, the model uses a climatology computed from MERRA-2 data, which is also available via Hugging Face. See the paper for details on the variables chosen and the methodology behind the climatology.
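As a rough illustration of what such a climatology represents -- a long-term mean of each variable as a function of time of year and time of day -- the toy pandas sketch below averages a synthetic hourly series by (day of year, hour). The variable name, grouping keys, and data are assumptions for illustration only; the actual MERRA-2 climatology follows the methodology described in the paper.

import numpy as np
import pandas as pd

# Synthetic hourly "2 m temperature" series standing in for a MERRA-2 variable.
times = pd.date_range("2000-01-01", "2004-12-31 23:00", freq="h")
rng = np.random.default_rng(0)
t2m = 288 + 10 * np.sin(2 * np.pi * times.dayofyear / 365.25) + rng.normal(0, 1, len(times))
df = pd.DataFrame({"t2m": t2m}, index=times)

# Toy climatology: mean per (day of year, hour of day) across all years.
climatology = df.groupby([df.index.dayofyear, df.index.hour])["t2m"].mean()
climatology.index.names = ["dayofyear", "hour"]

# Anomaly of a given timestamp relative to the climatology.
ts = pd.Timestamp("2003-07-01 12:00")
anomaly = df.loc[ts, "t2m"] - climatology.loc[(ts.dayofyear, ts.hour)]
print(float(anomaly))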
Citation
If you use this work, please consider citing our paper:
@misc{schmude2024prithviwxcfoundationmodel,
title={Prithvi WxC: Foundation Model for Weather and Climate},
author={Johannes Schmude and Sujit Roy and Will Trojak and Johannes Jakubik and Daniel Salles Civitarese and Shraddha Singh and Julian Kuehnert and Kumar Ankur and Aman Gupta and Christopher E Phillips and Romeo Kienzler and Daniela Szwarcman and Vishal Gaur and Rajat Shinde and Rohit Lal and Arlindo Da Silva and Jorge Luis Guevara Diaz and Anne Jones and Simon Pfreundschuh and Amy Lin and Aditi Sheshadri and Udaysankar Nair and Valentine Anantharaj and Hendrik Hamann and Campbell Watson and Manil Maskey and Tsengdar J Lee and Juan Bernabe Moreno and Rahul Ramachandran},
year={2024},
eprint={2409.13598},
archivePrefix={arXiv},
primaryClass={cs.LG},
url={https://arxiv.org/abs/2409.13598},
}
Owner metadata
- Name: Inter Agency Implementation and Advanced Concepts
- Login: NASA-IMPACT
- Email: [email protected]
- Kind: organization
- Description:
- Website:
- Location:
- Twitter:
- Company:
- Icon url: https://avatars.githubusercontent.com/u/22798984?v=4
- Repositories: 88
- Last synced at: 2023-03-04T07:55:15.502Z
- Profile URL: https://github.com/NASA-IMPACT
GitHub Events
Total
- Create event: 7
- Release event: 3
- Issues event: 15
- Watch event: 43
- Delete event: 4
- Issue comment event: 10
- Push event: 13
- Pull request review comment event: 2
- Pull request review event: 5
- Pull request event: 18
- Fork event: 14
Last Year
- Create event: 7
- Release event: 3
- Issues event: 15
- Watch event: 43
- Delete event: 4
- Issue comment event: 10
- Push event: 13
- Pull request review comment event: 2
- Pull request review event: 5
- Pull request event: 18
- Fork event: 14
Committers metadata
Last synced: 5 days ago
Total Commits: 87
Total Committers: 7
Avg Commits per committer: 12.429
Development Distribution Score (DDS): 0.529
Commits in past year: 87
Committers in past year: 7
Avg Commits per committer in past year: 12.429
Development Distribution Score (DDS) in past year: 0.529
Name | Email | Commits
---|---|---
Johannes Schmude | J****e@i****m | 41
Will Trojak | w****k@i****m | 22
Romeo Kienzler | r****1@i****m | 10
Lelouch vi' Britania | s****4@h****m | 7
Daniel Salles Civitarese | 1****e | 5
lino | 4****i | 1
Julian Kuehnert | J****u | 1
Committer domains:
- ibm.com: 3
Issue and Pull Request metadata
Last synced: 1 day ago
Total issues: 12
Total pull requests: 45
Average time to close issues: 11 days
Average time to close pull requests: about 12 hours
Total issue authors: 8
Total pull request authors: 7
Average comments per issue: 0.75
Average comments per pull request: 0.33
Merged pull requests: 37
Bot issues: 0
Bot pull requests: 0
Past year issues: 12
Past year pull requests: 45
Past year average time to close issues: 11 days
Past year average time to close pull requests: about 12 hours
Past year issue authors: 8
Past year pull request authors: 7
Past year average comments per issue: 0.75
Past year average comments per pull request: 0.33
Past year merged pull requests: 37
Past year bot issues: 0
Past year bot pull requests: 0
Top Issue Authors
- WillTrojak (4)
- ShileiCao (2)
- rubencart (1)
- taylorfturner (1)
- gajeshladhar (1)
- sdash77 (1)
- whpy (1)
- seangtkelley (1)
Top Pull Request Authors
- WillTrojak (21)
- dancivitarese (9)
- romeokienzler (8)
- johannesschmude (4)
- Jubeku (1)
- michelebanfi (1)
- take2rohit (1)
Top Issue Labels
- bug (1)
Top Pull Request Labels
Dependencies
- Sphinx >=8.0.2
- graphviz *
- mkdocs *
- pytools >=2016.2.1
- sphinx-rtd-theme *
- sphinxcontrib-napoleon *
- h5py ~= 3.3
- numpy ~= 1.26
- pandas ~= 2.2
- torch >= 2.2
Score: 6.9363427358340495