A curated list of open technology projects to sustain a stable climate, energy supply, biodiversity and natural resources.

STARCOP

Semantic Segmentation of Methane Plumes with Hyperspectral Machine Learning models.
https://github.com/spaceml-org/starcop

Category: Emissions
Sub Category: Emission Observation and Modeling

Keywords

aviris aviris-ng emit hyperspectral hyperspectral-datasets machine-learning methane methane-detection

Last synced: about 3 hours ago

Repository metadata

Official code for STARCOP: Semantic Segmentation of Methane Plumes with Hyperspectral Machine Learning models :rainbow::artificial_satellite:

README.md

STARCOP


Semantic Segmentation of Methane Plumes with Hyperspectral ML Models

Dataset

The full annotated dataset used for training and evaluation is hosted on Zenodo. For easier access to the data in the demos, smaller subsets are also hosted on Google Drive: the evaluation dataset and a subset of the training dataset containing only strong plumes.
We provide selected AVIRIS-NG hyperspectral bands, computed methane enhancement products, and simulated multispectral views of the data from WorldView-3. For more details, see the paper.

All bands: If you'd like to use more AVIRIS-NG bands, please contact us for instructions on downloading the full data (a preview of the formatting in a mini dataset is also available here).

For dataset inspection, use the prepared Colab Dataset Exploration demo.
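
For a quick local look without Colab, the sketch below (not part of the official demos) loads the STARCOP_mini subset used in the minimal training example further down; it assumes the archive has already been downloaded and unzipped as described there, and that the per-tile products are stored as GeoTIFFs, which is why it discovers files and columns rather than assuming a fixed schema.

# Local inspection sketch -- assumes STARCOP_mini.zip has been downloaded and
# unzipped (see the "Minimal training example" section below).
import glob
import pandas as pd
import rasterio

train_df = pd.read_csv("STARCOP_mini/train_mini10.csv")
print(train_df.columns.tolist())   # discover the CSV schema instead of assuming it
print(train_df.head())

# Open the first GeoTIFF product found under the extracted folder.
tif_paths = sorted(glob.glob("STARCOP_mini/**/*.tif", recursive=True))
with rasterio.open(tif_paths[0]) as src:
    band = src.read(1)
    print(tif_paths[0], band.shape, band.dtype)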

Code examples

Install

conda create -c conda-forge -n starcop_env python=3.10 mamba
conda activate starcop_env

pip install git+https://github.com/spaceml-org/STARCOP.git
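
As a quick sanity check (an addition, not from the official docs), the install can be verified from Python; note that "starcop" as the top-level import name is an assumption, so adjust it if the installed package exposes a different module name.

# Environment smoke test -- the "starcop" import name is an assumption.
import torch
import pytorch_lightning
import starcop  # noqa: F401  (assumed top-level module of the installed package)

print("torch:", torch.__version__)
print("pytorch-lightning:", pytorch_lightning.__version__)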

Inference

To start using our model for inference, you can check the demo with AVIRIS data in Colab Inference on AVIRIS, or with EMIT data in Colab Inference on EMIT. These download our annotated datasets and demonstrate the performance of our pre-trained models.

Our trained models are stored on Hugging Face 🤗 at isp-uv-es/starcop.
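
As a hedged sketch of fetching those weights outside the notebooks, the huggingface_hub client can be used directly; the repository id comes from the line above, but the checkpoint filenames are not documented here, so the snippet enumerates the files first instead of assuming a name.

# Sketch: list and download files from the isp-uv-es/starcop model repository.
from huggingface_hub import hf_hub_download, list_repo_files

repo_id = "isp-uv-es/starcop"
files = list_repo_files(repo_id)
print(files)

# Download one file once its name is known (index 0 used purely as an example).
local_path = hf_hub_download(repo_id=repo_id, filename=files[0])
print("Saved to:", local_path)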

Training

To reproduce the same training process as reported in the paper, you will need to download the whole STARCOP dataset from Zenodo first, and prepare the coding environment.

# Check possible parameters with:
python3 -m scripts.train --help

# Or run the prepared training script used for the paper models (remember to download the training datasets and adjust the paths)
./bash/bash_train_example.sh

Minimal training example

If you installed the environment using the commands above, the following should work as a minimal training example (it first downloads the data):

gdown https://drive.google.com/uc?id=1Qw96Drmk2jzBYSED0YPEUyuc2DnBechl -O STARCOP_mini.zip
unzip -q STARCOP_mini.zip
# The train script expects the test dataset to be listed in "test.csv", so in this small demo we simply put the mini subset there instead:
cp STARCOP_mini/test_mini10.csv STARCOP_mini/test.csv

python -m scripts.train dataset.input_products=["mag1c","TOA_AVIRIS_640nm","TOA_AVIRIS_550nm","TOA_AVIRIS_460nm"] model.model_type='unet_semseg' model.pos_weight=1 experiment_name="HyperSTARCOP_magic_rgb_DEMO" dataloader.num_workers=4 dataset.use_weight_loss=True training.val_check_interval=0.5 training.max_epochs=5 products_plot=["rgb_aviris","mag1c","label","pred","differences"] dataset.weight_sampling=True dataset.train_csv="train_mini10.csv" dataset.root_folder=PATH_TO/STARCOP_mini wandb.wandb_entity="YOUR_ENTITY" wandb.wandb_project="starcop_project"
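
The dotted key=value arguments above are Hydra/OmegaConf-style overrides applied to the project's configuration tree. Purely as an illustration (it does not load the project's real Hydra configs), the snippet below uses OmegaConf, already a dependency, to show how such a dot-list becomes a nested config:

# Illustration only: how dotted overrides map to a nested config.
from omegaconf import OmegaConf

overrides = [
    "model.model_type=unet_semseg",
    "model.pos_weight=1",
    "dataset.train_csv=train_mini10.csv",
    "dataset.use_weight_loss=True",
    "training.max_epochs=5",
]
cfg = OmegaConf.from_dotlist(overrides)
print(OmegaConf.to_yaml(cfg))
print(cfg.model.model_type)  # -> unet_semseg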

Citation

If you find the STARCOP models or dataset useful in your research, please consider citing our work.

@article{ruzicka_starcop_2023,
  title = {Semantic segmentation of methane plumes with hyperspectral machine learning models},
  volume = {13},
  issn = {2045-2322},
  url = {https://www.nature.com/articles/s41598-023-44918-6},
  doi = {10.1038/s41598-023-44918-6},
  number = {1},
  journal = {Scientific Reports},
  author = {Růžička, Vít and Mateo-Garcia, Gonzalo and Gómez-Chova, Luis and Vaughan, Anna and Guanter, Luis and Markham, Andrew},
  month = nov,
  year = {2023},
  pages = {19999}
}

Acknowledgments

This research has been funded by ESA Cognitive Cloud Computing in Space initiative project number STARCOP I-2022-00380. It has been supported by the DEEPCLOUD project (PID2019-109026RB-I00) funded by the Spanish Ministry of Science and Innovation (MCIN/AEI/10.13039/501100011033) and the European Union (NextGenerationEU).


Committers metadata

Last synced: 6 days ago

Total Commits: 24
Total Committers: 3
Avg Commits per committer: 8.0
Development Distribution Score (DDS): 0.375

Commits in past year: 6
Committers in past year: 2
Avg Commits per committer in past year: 3.0
Development Distribution Score (DDS) in past year: 0.333
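
For readers unfamiliar with the metric: as far as I understand the ecosyste.ms definition, the Development Distribution Score is one minus the top committer's share of commits, which is consistent with the numbers above (15 of 24 commits overall, and an implied 4 of 6 in the past year).

# Assumed DDS formula: 1 - top_committer_commits / total_commits
def dds(top_committer_commits, total_commits):
    return 1 - top_committer_commits / total_commits

print(round(dds(15, 24), 3))  # 0.375 -- all-time score above
print(round(dds(4, 6), 3))    # 0.333 -- past-year score above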

Name                   Email           Commits
Vit Ruzicka            p****s@g****m   15
Gonzalo Mateo Garcia   g****a@u****g   5
Gonzalo Mateo García   g****8@g****m   4


Issue and Pull Request metadata

Last synced: 1 day ago

Total issues: 6
Total pull requests: 1
Average time to close issues: about 1 month
Average time to close pull requests: 6 days
Total issue authors: 5
Total pull request authors: 1
Average comments per issue: 1.17
Average comments per pull request: 0.0
Merged pull requests: 1
Bot issues: 0
Bot pull requests: 0

Past year issues: 4
Past year pull requests: 0
Past year average time to close issues: 7 days
Past year average time to close pull requests: N/A
Past year issue authors: 3
Past year pull request authors: 0
Past year average comments per issue: 0.75
Past year average comments per pull request: 0
Past year merged pull requests: 0
Past year bot issues: 0
Past year bot pull requests: 0

More stats: https://issues.ecosyste.ms/repositories/lookup?url=https://github.com/spaceml-org/starcop

Top Issue Authors

  • yaoyuan10475 (2)
  • heyref (1)
  • 1750585724 (1)
  • klaraklycke (1)
  • huang-2 (1)

Top Pull Request Authors

  • gonzmg88 (1)


Dependencies

requirements.txt pypi
  • fsspec *
  • gcsfs *
  • geopandas *
  • hydra-core *
  • ipykernel *
  • ipython *
  • kornia ==0.6.7
  • matplotlib *
  • omegaconf *
  • pytorch-lightning ==1.6.4
  • rasterio *
  • scikit-image *
  • scikit-learn *
  • segmentation_models_pytorch *
  • torch ==1.13.1
  • torchmetrics ==0.10.0
  • wandb *
  • wandb ==0.13.3
requirements_package.txt pypi
  • fsspec *
  • gcsfs *
  • geopandas *
  • georeader-spaceml *
  • hydra-core *
  • ipykernel *
  • ipython *
  • kornia *
  • matplotlib *
  • omegaconf *
  • pytorch-lightning *
  • rasterio *
  • scikit-image *
  • scikit-learn *
  • segmentation_models_pytorch *
  • torch *
  • torchmetrics *
  • wandb *
setup.py pypi

Score: 5.030437921392435