A curated list of open technology projects to sustain a stable climate, energy supply, biodiversity and natural resources.

MPAS-Analysis

Analysis for simulations produced with Model for Prediction Across Scales (MPAS) components and the Energy Exascale Earth System Model (E3SM), which uses those components.
https://github.com/mpas-dev/mpas-analysis

Category: Climate Change
Sub Category: Climate Data Processing and Analysis

Keywords

climate-analysis climate-model e3sm mpas mpas-analysis

Keywords from Contributors

climate climate-science verification ice-sheet-models

Last synced: about 2 hours ago

Repository metadata

Provides analysis for the MPAS components of E3SM

README.md

MPAS-Analysis

Analysis for simulations produced with Model for Prediction Across Scales
(MPAS) components and the Energy Exascale Earth System Model (E3SM), which
uses those components.

[Figure: example sea surface temperature plot]

Documentation

https://mpas-dev.github.io/MPAS-Analysis/develop/

Installation for users

MPAS-Analysis is available as an anaconda package via the conda-forge channel:

conda config --add channels conda-forge
conda create -n mpas-analysis mpas-analysis
conda activate mpas-analysis

Installation for developers

To use the latest version for developers, get the code from:
https://github.com/MPAS-Dev/MPAS-Analysis

Then, you will need to set up a conda environment from the MPAS-Analysis repo.
This environment will include the required dependencies for the development
branch from dev-spec.txt and will install the mpas_analysis package into
the conda environment in a way that points directly to the local branch (so
changes you make to the code directly affect mpas_analysis in the conda
environment):

conda config --add channels conda-forge
conda config --set channel_priority strict
conda create -y -n mpas_analysis_dev --file dev-spec.txt
conda activate mpas_analysis_dev
python -m pip install --no-deps --no-build-isolation -e .

If you are developing another conda package at the same time (this is common
for MPAS-Tools or geometric_features), you should first comment out the other
package in dev-spec.txt. Then, you can install both packages in the same
development environment, e.g.:

conda create -y -n mpas_analysis_dev --file tools/MPAS-Tools/conda_package/dev-spec.txt \
    --file analysis/MPAS-Analysis/dev-spec.txt
conda activate mpas_analysis_dev
cd tools/MPAS-Tools/conda_package
python -m pip install --no-deps --no-build-isolation -e .
cd ../../../analysis/MPAS-Analysis
python -m pip install --no-deps --no-build-isolation -e .

Obviously, the paths to the repos may be different in your local clones. With
the mpas_analysis_dev environment as defined above, you can make changes to both
mpas_tools and mpas-analysis packages in their respective branches, and
these changes will be reflected when you refer to the packages or call their
respective entry points (command-line tools).

Download analysis input data

If you installed the mpas-analysis package, download the data needed by
MPAS-Analysis by running:

download_analysis_data -o /path/to/mpas_analysis/diagnostics

where /path/to/mpas_analysis/diagnostics is the main folder that will contain
two subdirectories:

  • mpas_analysis, which includes mapping and region mask files for
    standard resolution MPAS meshes
  • observations, which includes the pre-processed observations listed in the
    Observations table and used to evaluate the model results

Once you have downloaded the analysis data, you will point to its location
(your equivalent of path/to/mpas_analysis/diagnostics above) in the config
option baseDirectory in the [diagnostics] section.
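For example, the pointer can be set in your custom config file; a minimal sketch, with the path as a placeholder:

```ini
[diagnostics]
# Base path to the downloaded diagnostics data (mapping and region mask
# files plus observations); replace with your local path
baseDirectory = /path/to/mpas_analysis/diagnostics
```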

List Analysis

If you installed the mpas-analysis package, list the available analysis tasks
by running:

mpas_analysis --list

This lists all tasks and their tags. These can be used with the --generate
command-line option or the generate config option. See
mpas_analysis/default.cfg for more details.

Running the analysis

  1. Create an empty config file (say myrun.cfg), copy example.cfg,
    or copy one of the example files in the configs directory (if using a
    git repo), or download one from the
    example configs directory.
  2. Either modify config options in your new file or copy and modify config
    options from mpas_analysis/default.cfg (in a git repo) or directly
    from GitHub:
    default.cfg.
  3. If you installed the mpas-analysis package, run:
    mpas_analysis myrun.cfg. This will read the configuration
    first from mpas_analysis/default.cfg and then replace that
    configuration with any changes from myrun.cfg
  4. If you want to run a subset of the analysis, you can either set the
    generate option under [output] in your config file or use the
    --generate flag on the command line. See the comments in
    mpas_analysis/default.cfg for more details on this option.

Requirements for custom config files:

  • At minimum you should set baseDirectory under [output] to the folder
    where output is stored. NOTE this value should be a unique
    directory for each run being analyzed. If multiple runs are analyzed in
    the same directory, cached results from a previous analysis will not be
    updated correctly.
  • Any options you copy into the config file must include the
    appropriate section header (e.g. '[run]' or '[output]')
  • You do not need to copy all options from mpas_analysis/default.cfg.
    This file will automatically be used for any options you do not include
    in your custom config file.
  • You should not modify mpas_analysis/default.cfg directly.
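Putting these requirements together, a minimal custom config file might look like the sketch below; the paths are placeholders, and option names other than [output]/baseDirectory should be double-checked against mpas_analysis/default.cfg:

```ini
[input]
# Directory containing the simulation output to analyze
baseDirectory = /path/to/simulation/output

[output]
# Unique analysis output directory for this run
baseDirectory = /path/to/analysis/output
# Optional: generate only a subset of tasks (see default.cfg for tags)
generate = ['all']

[climatology]
# Years over which climatologies are computed
startYear = 2
endYear = 11
```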

List of MPAS output files that are needed by MPAS-Analysis:

  • mpas-o files:
    • mpaso.hist.am.timeSeriesStatsMonthly.*.nc (Note: since OHC
      anomalies are computed with respect to the first year of the simulation,
      if OHC diagnostics are activated, the analysis will need the
      first full year of mpaso.hist.am.timeSeriesStatsMonthly.*.nc
      files, no matter what [timeSeries]/startYear and
      [timeSeries]/endYear are. This is especially important to know if
      short-term archiving is used in the run to analyze: in that case, set
      [input]/runSubdirectory, [input]/oceanHistorySubdirectory and
      [input]/seaIceHistorySubdirectory to the appropriate run and archive
      directories and choose [timeSeries]/startYear and
      [timeSeries]/endYear to include only data that have been short-term
      archived).
    • mpaso.hist.am.meridionalHeatTransport.0001-03-01.nc (or any
      hist.am.meridionalHeatTransport file)
    • mpaso.rst.0002-01-01_00000.nc (or any other mpas-o restart file)
    • streams.ocean
    • mpaso_in
  • mpas-seaice files:
    • mpasseaice.hist.am.timeSeriesStatsMonthly.*.nc
    • mpasseaice.rst.0002-01-01_00000.nc (or any other mpas-seaice restart
      file)
    • streams.seaice
    • mpassi_in

Note: for older runs, mpas-seaice files will be named:

  • mpascice.hist.am.timeSeriesStatsMonthly.*.nc
  • mpascice.rst.0002-01-01_00000.nc
  • streams.cice
  • mpas-cice_in

Also, for older runs, mpaso_in will be named mpas-o_in.

Purge Old Analysis

To purge old analysis (delete the whole output directory) before running the
analysis, add the --purge flag. If you installed mpas-analysis as
a package, run:

mpas_analysis --purge <config.file>

All of the subdirectories listed in the [output] config section will be
deleted along with the climatology subdirectories in oceanObservations and
seaIceObservations.

It is a good policy to use the purge flag for most changes to the config file,
for example: updating the start and/or end years of climatologies (and
sometimes time series), changing the resolution of a comparison grid, renaming
the run, changing the seasons over which climatologies are computed for a given
task, or updating the code to the latest version.

Cases where it is reasonable not to purge would be, for example:

  • changing options that only affect plotting (color map, ticks, ranges, font
    sizes, etc.)
  • rerunning with a different set of tasks specified by the generate option
    (though this will often cause climatologies to be re-computed with new
    variables and may not save time compared with purging)
  • generating only the final website with --html_only
  • re-running after the simulation has progressed to extend time series
    (however, this is not recommended for changing the bounds on
    climatologies, see above)

Running in parallel via a queueing system

  1. If you are running from a git repo, copy the appropriate job script file
    from configs/<machine_name> to the root directory (or another directory
    if preferred). The default script, configs/job_script.default.bash, is
    appropriate for a laptop or desktop computer with multiple cores.
  2. If using the mpas-analysis conda package, download the job script and/or
    sample config file from the
    example configs directory.
  3. Modify the number of parallel tasks, the run name, the output directory
    and the path to the config file for the run.
  4. Note: the number of parallel tasks can be anything between 1 and the
    number of analysis tasks to be performed. If there are more tasks than
    parallel tasks, later tasks will simply wait until earlier tasks have
    finished.
  5. Submit the job using the modified job script

If a job script for your machine is not available, try modifying the default
job script in configs/job_script.default.bash or one of the job scripts for
another machine to fit your needs.

Customizing plots or creating new ones

There are three main ways to customize the plots that MPAS-Analysis
already makes or to create new ones:

  1. customize the config file. Some features, such as colormaps and colorbar
    limits for color-shaded plots or depth ranges for ocean region time series,
    can be customized: look at mpas_analysis/default.cfg for the available
    customizations for each analysis task.
  2. read the analysis data computed by MPAS-Analysis into custom scripts. When
    running MPAS-Analysis with the purpose of generating both climatologies
    and time series, the following data sets are generated:
    • [baseDirectory]/clim/mpas/avg/unmasked_[mpasMeshName]: MPAS-Ocean
      and MPAS-seaice climatologies on the native grid.
    • [baseDirectory]/clim/mpas/avg/remapped: remapped climatologies
      for each chosen task (climatology files are stored in different
      subdirectories according to the task name).
    • [baseDirectory]/clim/obs: observational climatologies.
    • [baseDirectory]/clim/mpas/avg/mocStreamfunction_years[startYear]-[endYear].nc.
    • [baseDirectory]/clim/mpas/avg/meridionalHeatTransport_years[startYear]-[endYear].nc.
    • [baseDirectory]/timeseries: various time series data.
      Custom scripts can then utilize these datasets to generate custom plots.
  3. add a new analysis task to MPAS-Analysis (see below).
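As a sketch of option 2, a custom script might first locate one of the data sets listed above. The helper below only assembles the paths described in that list; the task name, mesh, years, and zero-padding of the file name are illustrative, and the commented lines show how the file could then be opened, e.g. with xarray:

```python
import os


def remapped_clim_dir(base_directory, task_name):
    """Directory of remapped climatologies for one analysis task
    (layout as described in the list above)."""
    return os.path.join(base_directory, 'clim', 'mpas', 'avg',
                        'remapped', task_name)


def moc_streamfunction_file(base_directory, start_year, end_year):
    """Path to the MOC streamfunction climatology file
    (year zero-padding here is illustrative)."""
    return os.path.join(
        base_directory, 'clim', 'mpas', 'avg',
        f'mocStreamfunction_years{start_year:04d}-{end_year:04d}.nc')


# Hypothetical base directory and task name
clim_dir = remapped_clim_dir('/path/to/analysis/output', 'climatologyMapSST')
moc_file = moc_streamfunction_file('/path/to/analysis/output', 2, 11)

# A custom script could then open these files, e.g.:
# import xarray as xr
# ds = xr.open_dataset(moc_file)
```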

Instructions for creating a new analysis task

Analysis tasks can be found in a directory corresponding to each component,
e.g., mpas_analysis/ocean for MPAS-Ocean. Shared functionality is contained
within the mpas_analysis/shared directory.

  1. create a new task by copying mpas_analysis/analysis_task_template.py to
    the appropriate folder (ocean, sea_ice, etc.) and modifying it as
    described in the template. Take a look at
    mpas_analysis/shared/analysis_task.py for additional guidance.
  2. note, no changes need to be made to mpas_analysis/shared/analysis_task.py
  3. modify mpas_analysis/default.cfg (and possibly any machine-specific
    config files in configs/<machine>)
  4. import new analysis task in mpas_analysis/<component>/__init__.py
  5. add new analysis task to mpas_analysis/__main__.py under
    build_analysis_list, see below.

A new analysis task can be added with:

   analyses.append(<component>.MyTask(config, myArg='argValue'))

This will add a new object of the MyTask class to the list of analysis tasks
created in build_analysis_list. Later on, in run_analysis, MPAS-Analysis will
first go through the list to determine whether each task needs to be generated
(by calling check_generate, which is defined in AnalysisTask), then
call setup_and_check on each task (to make sure the appropriate analysis
member (AM) is on and the required files are present), and finally call run on
each task that is to be generated and is set up properly.
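The flow above can be illustrated with a simplified, self-contained sketch; AnalysisTaskStub below is only a stand-in for the real AnalysisTask base class (see mpas_analysis/shared/analysis_task.py for the actual interface), and the method bodies merely mimic the behavior described:

```python
class AnalysisTaskStub:
    """Simplified stand-in for mpas_analysis.shared.AnalysisTask."""

    def __init__(self, config, taskName):
        self.config = config
        self.taskName = taskName

    def check_generate(self):
        # The real method compares the task's name and tags against the
        # 'generate' config option; here, everything is generated
        return True

    def setup_and_check(self):
        # The real method verifies that the needed analysis members (AMs)
        # were enabled in the run and the required files exist
        pass

    def run(self):
        raise NotImplementedError


class MyTask(AnalysisTaskStub):
    def __init__(self, config, myArg):
        super().__init__(config, taskName='myTask')
        self.myArg = myArg

    def run(self):
        return f'ran myTask with {self.myArg}'


# build_analysis_list: collect the tasks
analyses = [MyTask(config={}, myArg='argValue')]

# run_analysis: filter, set up, then run each task
results = []
for task in analyses:
    if task.check_generate():
        task.setup_and_check()
        results.append(task.run())
```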

Generating Documentation

Create a development environment as described above in "Installation for
developers". Then, to generate the Sphinx documentation, run:

cd docs
make clean
make html

The results can be viewed in your web browser by opening:

_build/html/index.html

Committers metadata

Last synced: 5 days ago

Total Commits: 2,001
Total Committers: 24
Avg Commits per committer: 83.375
Development Distribution Score (DDS): 0.207

Commits in past year: 216
Committers in past year: 7
Avg Commits per committer in past year: 30.857
Development Distribution Score (DDS) in past year: 0.468

Name Email Commits
Xylar Asay-Davis x****m@g****m 1586
Carolyn Begeman c****n@l****v 78
Phillip J. Wolfram p****m@g****m 54
Milena Veneziani m****a@l****v 46
Darin Comeau d****u@l****v 38
Althea Denlinger a****n@u****u 26
Steven Brus s****s@g****m 24
Luke Van Roekel l****l@g****m 22
Anirban Sinha a****a@g****m 22
Riley X. Brady r****y@c****u 16
Irena Vankova i****k@g****m 15
Elizabeth Hunke e****e@l****v 14
Joseph H Kennedy k****h@o****v 13
Alice Barthel a****l@l****v 13
Stephen Price s****e@l****v 10
Greg Streletz s****z@l****v 9
Adrian Turner a****t@l****v 5
Kevin Rosa k****a@u****u 3
Stephen Price s****e@c****v 2
Charles Doutriaux d****1@l****v 1
Mark Petersen m****n@l****v 1
Baldwin b****2@m****v 1
Adrian K. Turner a****t@p****v 1
Matthew Hoffman m****n@l****v 1

Issue and Pull Request metadata

Last synced: 1 day ago

Total issues: 337
Total pull requests: 773
Average time to close issues: 4 months
Average time to close pull requests: 13 days
Total issue authors: 25
Total pull request authors: 22
Average comments per issue: 4.47
Average comments per pull request: 6.68
Merged pull request: 739
Bot issues: 0
Bot pull requests: 0

Past year issues: 14
Past year pull requests: 97
Past year average time to close issues: 9 days
Past year average time to close pull requests: 4 days
Past year issue authors: 6
Past year pull request authors: 5
Past year average comments per issue: 1.43
Past year average comments per pull request: 2.99
Past year merged pull request: 91
Past year bot issues: 0
Past year bot pull requests: 0

More stats: https://issues.ecosyste.ms/repositories/lookup?url=https://github.com/mpas-dev/mpas-analysis

Top Issue Authors

  • xylar (197)
  • pwolfram (39)
  • milenaveneziani (29)
  • vanroekel (17)
  • akturner (9)
  • alicebarthel (7)
  • golaz (6)
  • darincomeau (4)
  • matthewhoffman (4)
  • mark-petersen (4)
  • bradyrx (3)
  • chengdang (3)
  • cbegeman (2)
  • ytakano3 (2)
  • kevinrosa (1)

Top Pull Request Authors

  • xylar (653)
  • pwolfram (30)
  • milenaveneziani (21)
  • altheaden (15)
  • vanroekel (12)
  • cbegeman (12)
  • darincomeau (5)
  • akturner (4)
  • gstreletz (3)
  • eclare108213 (3)
  • anirban89 (2)
  • bradyrx (2)
  • irenavankova (2)
  • alicebarthel (1)
  • kevinrosa (1)

Top Issue Labels

  • enhancement (93)
  • bug (91)
  • priority (57)
  • clean up (41)
  • solution under review (30)
  • low priority (26)
  • help wanted (10)
  • solution under development (8)
  • sub-ice-shelf (6)
  • mpas_xarray (5)
  • potential hackathon task (4)
  • wontfix (4)
  • documenation (3)
  • question (3)
  • design document (2)
  • in progress (1)
  • performance (1)

Top Pull Request Labels

  • bug (233)
  • enhancement (229)
  • clean up (185)
  • priority (118)
  • documenation (74)
  • in progress (14)
  • mpas_xarray (13)
  • design document (12)
  • help wanted (6)
  • sub-ice-shelf (6)
  • ci (4)
  • dependencies (4)
  • performance (3)
  • work in progress (3)
  • low priority (2)
  • focus on functionality (2)
  • update public observations (1)
  • design finalized (1)
  • non-workflow (1)

Package metadata

proxy.golang.org: github.com/MPAS-Dev/MPAS-Analysis

conda-forge.org: mpas-analysis

Analysis for simulations produced with Model for Prediction Across Scales (MPAS) components and the Energy Exascale Earth System Model (E3SM), which uses those components.

  • Homepage: https://github.com/MPAS-Dev/MPAS-Analysis
  • Licenses: BSD-3-Clause
  • Latest release: 1.7.2 (published over 2 years ago)
  • Last Synced: 2025-04-01T02:08:40.440Z (27 days ago)
  • Versions: 18
  • Dependent Packages: 0
  • Dependent Repositories: 0
  • Rankings:
    • Forks count: 24.033%
    • Average: 31.165%
    • Stargazers count: 38.297%

Dependencies

setup.py pypi
suite/setup.py pypi
