Scores
A Python package of mathematical functions for the verification, evaluation and optimisation of forecasts, predictions or models, primarily supporting the meteorological, climatological and geoscientific communities.
https://github.com/nci/scores
Category: Climate Change
Sub Category: Earth and Climate Modeling
Keywords
climate contingency-table dask forecast-evaluation forecast-verification forecasting model-validation oceanography pandas python verification weather xarray
Keywords from Contributors
measur archiving transforms projection optimize animals generic compose conversion observation
Last synced: about 12 hours ago
Repository metadata
scores: Metrics for the verification, evaluation and optimisation of forecasts, predictions or models.
- Host: GitHub
- URL: https://github.com/nci/scores
- Owner: nci
- License: apache-2.0
- Created: 2023-05-31T23:38:24.000Z (almost 2 years ago)
- Default Branch: develop
- Last Pushed: 2025-04-16T00:09:50.000Z (11 days ago)
- Last Synced: 2025-04-22T07:47:42.931Z (5 days ago)
- Topics: climate, contingency-table, dask, forecast-evaluation, forecast-verification, forecasting, model-validation, oceanography, pandas, python, verification, weather, xarray
- Language: Jupyter Notebook
- Homepage: https://scores.readthedocs.io/
- Size: 18.5 MB
- Stars: 160
- Watchers: 8
- Forks: 32
- Open Issues: 75
- Releases: 23
Metadata Files:
- Readme: README.md
- Contributing: docs/contributing.md
- License: LICENSE
- Code of conduct: CODE_OF_CONDUCT.md
- Citation: CITATION.cff
- Security: docs/SECURITY.md
- Zenodo: .zenodo.json
README.md
scores: Verification and Evaluation for Forecasts and Models
A list of the over 60 metrics, statistical techniques and data processing tools contained in scores is available here.
scores is a Python package containing mathematical functions for the verification, evaluation and optimisation of forecasts, predictions or models. It supports labelled n-dimensional (multidimensional) data, which is used in many scientific fields and in machine learning. At present, scores primarily supports the geoscience communities; in particular, the meteorological, climatological and oceanographic communities.
Documentation: scores.readthedocs.io
Source code: github.com/nci/scores
Tutorial gallery: available here
Journal article: scores: A Python package for verifying and evaluating models and predictions with xarray
Overview
Below is a curated selection of the metrics, tools and statistical tests included in scores. (Click here for the full list.)
| | Description | Selection of Included Functions |
|---|---|---|
| Continuous | Scores for evaluating single-valued continuous forecasts. | MAE, MSE, RMSE, Additive Bias, Multiplicative Bias, Percent Bias, Pearson's Correlation Coefficient, Kling-Gupta Efficiency, Flip-Flop Index, Quantile Loss, Quantile Interval Score, Interval Score, Murphy Score, and threshold-weighted scores for expectiles, quantiles and Huber Loss. |
| Probability | Scores for evaluating forecasts that are expressed as predictive distributions, ensembles, and probabilities of binary events. | Brier Score, Continuous Ranked Probability Score (CRPS) for cumulative distribution functions (CDFs) and ensembles (including threshold-weighted versions), Receiver Operating Characteristic (ROC), Isotonic Regression (reliability diagrams). |
| Categorical | Scores for evaluating forecasts of categories. | 18 binary contingency table (confusion matrix) metrics, the FIxed Risk Multicategorical (FIRM) Score, and the SEEPS score. |
| Spatial | Scores that take into account spatial structure. | Fractions Skill Score. |
| Statistical Tests | Tools to conduct statistical tests and generate confidence intervals. | Diebold Mariano. |
| Processing Tools | Tools to pre-process data. | Data matching, Discretisation, Cumulative Distribution Function Manipulation. |
| Emerging | Emerging scores that are still undergoing mathematical peer review. They may change in line with the peer review process. | Risk Matrix Score. |
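For intuition about what two of the listed metric families compute, here is a plain-Python sketch of MAE (continuous) and the Brier score (probability). scores provides its own tested implementations; this sketch and its sample numbers are purely illustrative:

```python
def mae(forecasts, observations):
    """Mean absolute error: the average of |forecast - observation|."""
    return sum(abs(f - o) for f, o in zip(forecasts, observations)) / len(forecasts)

def brier_score(probabilities, outcomes):
    """Brier score: mean squared error of probability forecasts against
    binary (0/1) outcomes. Lower is better; 0 is a perfect score."""
    return sum((p - o) ** 2 for p, o in zip(probabilities, outcomes)) / len(probabilities)

print(mae([3.0, 5.0, 7.0], [1.0, 5.0, 9.0]))    # → 1.3333333333333333
print(brier_score([0.9, 0.1, 0.5], [1, 0, 1]))  # ≈ 0.09
```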
scores not only includes common scores (e.g., MAE, RMSE), it also includes novel scores not commonly found elsewhere (e.g., FIRM, Flip-Flop Index), complex scores (e.g., threshold-weighted CRPS), and statistical tests (e.g., the Diebold Mariano test). Additionally, it provides pre-processing tools for preparing data for scores in a variety of formats, including cumulative distribution functions (CDFs). scores provides its own implementations where relevant to avoid extensive dependencies.
scores primarily supports xarray datatypes for Earth system data, allowing it to work with NetCDF4, HDF5, Zarr and GRIB data formats, among others. scores uses Dask for scaling and performance. Some metrics work with pandas, and we aim to expand this capability.
All of the scores and metrics in this package have undergone a thorough scientific and software review. Every score has a companion Jupyter Notebook tutorial that demonstrates its use in practice.
Contributing
To find out more about contributing, see our contributing guide.
All interactions in discussions, issues, emails and code (e.g., pull requests, code comments) will be managed according to the expectations outlined in the code of conduct and in accordance with all relevant laws and obligations. This project is an inclusive, respectful and open project with high standards for respectful behaviour and language. The code of conduct is the Contributor Covenant, adopted by over 40,000 open source projects. Any concerns will be dealt with fairly and respectfully, with the processes described in the code of conduct.
Installation
The installation guide describes four different use cases for installing, using and working with this package.
Most users currently want the "all" installation option. This includes the mathematical functions (scores, metrics, statistical tests etc.), the tutorial dependencies and development libraries.
```bash
# From a local checkout of the Git repository
pip install -e .[all]
```
To install the mathematical functions ONLY (no tutorial dependencies, no developer libraries), use the default "minimal" installation option. "minimal" is a stable version with limited dependencies. This can be installed from the Python Package Index (PyPI) or with conda.
```bash
# From PyPI
pip install scores

# From conda-forge
conda install conda-forge::scores
```
(Note: at present, only the minimal installation option is available from conda. In time, we intend to add more installation options to conda.)
Using scores
Here is a short example of the use of scores:

```python
>>> import scores
>>> forecast = scores.sample_data.simple_forecast()
>>> observed = scores.sample_data.simple_observations()
>>> mean_absolute_error = scores.continuous.mae(forecast, observed)
>>> print(mean_absolute_error)
<xarray.DataArray ()>
array(2.)
```
Jupyter Notebook tutorials are provided for each metric and statistical test in scores, as well as for some of the key features of scores (e.g., dimension handling and weighting results).
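One of those key features, dimension handling, can be illustrated without scores itself: preserving a dimension means errors are averaged over every other dimension, leaving one score per coordinate of the preserved dimension. A plain-Python sketch of that idea (the dictionary layout, lead-time names and sample values here are hypothetical, not the scores API):

```python
# Illustrative sketch only: what "preserving a dimension" means conceptually.
# Forecast/observation pairs keyed by a lead-time dimension (hypothetical data).
data = {
    "lead_0h": [(1.0, 2.0), (3.0, 3.0)],
    "lead_6h": [(4.0, 1.0), (2.0, 4.0)],
}

# MAE with the lead-time dimension preserved: one score per lead time,
# averaged over the remaining dimension (here, the pairs at each lead time).
mae_per_lead = {
    lead: sum(abs(f - o) for f, o in pairs) / len(pairs)
    for lead, pairs in data.items()
}
print(mae_per_lead)  # {'lead_0h': 0.5, 'lead_6h': 2.5}
```

Reducing over all dimensions instead would collapse this to a single scalar, as in the MAE example above.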
To watch a PyCon AU 2024 conference presentation about scores, click here.
Finding, Downloading and Working With Data
All metrics, statistical techniques and data processing tools in scores work with xarray. Some metrics work with pandas. As such, scores works with any data source for which xarray or pandas can be used. See the data sources page and this tutorial for more information on finding, downloading and working with different sources of data.
scores on Zenodo
Archives of scores are available on Zenodo. Click here to see the latest version on Zenodo.
Acknowledging or Citing scores
If you use scores for a published work, we would appreciate you citing our paper:
Leeuwenburg, T., Loveday, N., Ebert, E. E., Cook, H., Khanarmuei, M., Taggart, R. J., Ramanathan, N., Carroll, M., Chong, S., Griffiths, A., & Sharples, J. (2024). scores: A Python package for verifying and evaluating models and predictions with xarray. Journal of Open Source Software, 9(99), 6889. https://doi.org/10.21105/joss.06889
BibTeX:
@article{Leeuwenburg_scores_A_Python_2024,
author = {Leeuwenburg, Tennessee and Loveday, Nicholas and Ebert, Elizabeth E. and Cook, Harrison and Khanarmuei, Mohammadreza and Taggart, Robert J. and Ramanathan, Nikeeth and Carroll, Maree and Chong, Stephanie and Griffiths, Aidan and Sharples, John},
doi = {10.21105/joss.06889},
journal = {Journal of Open Source Software},
month = jul,
number = {99},
pages = {6889},
title = {{scores: A Python package for verifying and evaluating models and predictions with xarray}},
url = {https://joss.theoj.org/papers/10.21105/joss.06889},
volume = {9},
year = {2024}
}
Citation (CITATION.cff)
cff-version: "1.2.0"
authors:
  - family-names: Leeuwenburg
    given-names: Tennessee
    orcid: "https://orcid.org/0009-0008-2024-1967"
  - family-names: Loveday
    given-names: Nicholas
    orcid: "https://orcid.org/0009-0000-5796-7069"
  - family-names: Ebert
    given-names: Elizabeth E.
  - family-names: Cook
    given-names: Harrison
    orcid: "https://orcid.org/0009-0009-3207-4876"
  - family-names: Khanarmuei
    given-names: Mohammadreza
    orcid: "https://orcid.org/0000-0002-5017-9622"
  - family-names: Taggart
    given-names: Robert J.
    orcid: "https://orcid.org/0000-0002-0067-5687"
  - family-names: Ramanathan
    given-names: Nikeeth
    orcid: "https://orcid.org/0009-0002-7406-7438"
  - family-names: Carroll
    given-names: Maree
    orcid: "https://orcid.org/0009-0008-6830-8251"
  - family-names: Chong
    given-names: Stephanie
    orcid: "https://orcid.org/0009-0007-0796-4127"
  - family-names: Griffiths
    given-names: Aidan
  - family-names: Sharples
    given-names: John
message: If you use this software, please cite our article in the Journal of Open Source Software.
title: "scores: A Python package for verifying and evaluating models and predictions with xarray"
preferred-citation:
  authors:
    - family-names: Leeuwenburg
      given-names: Tennessee
      orcid: "https://orcid.org/0009-0008-2024-1967"
    - family-names: Loveday
      given-names: Nicholas
      orcid: "https://orcid.org/0009-0000-5796-7069"
    - family-names: Ebert
      given-names: Elizabeth E.
    - family-names: Cook
      given-names: Harrison
      orcid: "https://orcid.org/0009-0009-3207-4876"
    - family-names: Khanarmuei
      given-names: Mohammadreza
      orcid: "https://orcid.org/0000-0002-5017-9622"
    - family-names: Taggart
      given-names: Robert J.
      orcid: "https://orcid.org/0000-0002-0067-5687"
    - family-names: Ramanathan
      given-names: Nikeeth
      orcid: "https://orcid.org/0009-0002-7406-7438"
    - family-names: Carroll
      given-names: Maree
      orcid: "https://orcid.org/0009-0008-6830-8251"
    - family-names: Chong
      given-names: Stephanie
      orcid: "https://orcid.org/0009-0007-0796-4127"
    - family-names: Griffiths
      given-names: Aidan
    - family-names: Sharples
      given-names: John
  date-published: 2024-07-09
  doi: 10.21105/joss.06889
  issn: 2475-9066
  issue: 99
  journal: Journal of Open Source Software
  publisher:
    name: Open Journals
  start: 6889
  title: "scores: A Python package for verifying and evaluating models and predictions with xarray"
  type: article
  url: "https://joss.theoj.org/papers/10.21105/joss.06889"
  volume: 9
Owner metadata
- Name: NCI
- Login: nci
- Email:
- Kind: organization
- Description:
- Website: http://nci.org.au/
- Location: Australian National University, Canberra, Australia
- Twitter:
- Company:
- Icon url: https://avatars.githubusercontent.com/u/18274870?v=4
- Repositories: 4
- Last synced at: 2023-03-11T03:36:43.281Z
- Profile URL: https://github.com/nci
GitHub Events
Total
- Create event: 6
- Release event: 2
- Issues event: 103
- Watch event: 94
- Delete event: 7
- Member event: 1
- Issue comment event: 418
- Push event: 166
- Pull request review event: 432
- Pull request review comment event: 492
- Pull request event: 143
- Fork event: 15
Last Year
- Create event: 6
- Release event: 2
- Issues event: 103
- Watch event: 94
- Delete event: 7
- Member event: 1
- Issue comment event: 418
- Push event: 166
- Pull request review event: 432
- Pull request review comment event: 492
- Pull request event: 143
- Fork event: 15
Committers metadata
Last synced: 6 days ago
Total Commits: 687
Total Committers: 22
Avg Commits per committer: 31.227
Development Distribution Score (DDS): 0.501
Commits in past year: 545
Committers in past year: 18
Avg Commits per committer in past year: 30.278
Development Distribution Score (DDS) in past year: 0.484
Name | Email | Commits
---|---|---
Tennessee Leeuwenburg | t****g@b****u | 343 |
Stephanie Chong | 1****g | 138 |
Nicholas Loveday | 4****y | 90 |
Nikeeth Ramanathan | n****n@g****m | 17 |
Aidan Griffiths | a****s@b****u | 13 |
Harrison Cook | h****k@b****u | 11 |
Arshia Sharma | a****a@a****u | 8 |
dependabot[bot] | 4****] | 8 |
Deryn | 1****s | 8 |
reza-armuei | 1****i | 7 |
John Sharples | j****s@b****u | 7 |
Maree Carroll | m****l@g****m | 6 |
rob-taggart | 8****t | 5 |
Aidan Griffiths | 5****s | 5 |
arshia | 6****r | 4 |
Liam Bluett | 8****t | 4 |
Beth Ebert | b****t@b****u | 3 |
AJTheDataGuy | 1****y | 2 |
Dougie Squire | 4****e | 2 |
JinghanFu | 1****u | 2 |
Samuel Bishop | l****s@m****m | 2 |
durgals | d****a@g****m | 2 |
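The Development Distribution Score (DDS) reported above is, by the commonly used definition, one minus the share of commits made by the most active committer. A quick sanity check against the table (assuming that definition):

```python
# Figures taken from the committer metadata above.
top_committer_commits = 343  # Tennessee Leeuwenburg
total_commits = 687

# DDS = 1 - (top committer's commits / total commits); higher values
# indicate development spread more evenly across contributors.
dds = 1 - top_committer_commits / total_commits
print(round(dds, 3))  # → 0.501, matching the reported DDS
```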
Committer domains:
- bom.gov.au: 5
- me.com: 1
- autogeneral.com.au: 1
Issue and Pull Request metadata
Last synced: 1 day ago
Total issues: 357
Total pull requests: 386
Average time to close issues: about 1 month
Average time to close pull requests: 9 days
Total issue authors: 18
Total pull request authors: 20
Average comments per issue: 1.7
Average comments per pull request: 1.87
Merged pull requests: 327
Bot issues: 0
Bot pull requests: 5
Past year issues: 217
Past year pull requests: 256
Past year average time to close issues: 28 days
Past year average time to close pull requests: 5 days
Past year issue authors: 12
Past year pull request authors: 16
Past year average comments per issue: 2.19
Past year average comments per pull request: 2.0
Past year merged pull requests: 225
Past year bot issues: 0
Past year bot pull requests: 5
Top Issue Authors
- tennlee (120)
- nicholasloveday (95)
- Steph-Chong (48)
- nikeethr (19)
- aidanjgriffiths (18)
- rob-taggart (11)
- savente93 (11)
- calebweinreb (6)
- mareecarroll (5)
- HCookie (5)
- John-Sharples (5)
- durgals (4)
- bomRob (3)
- wuxx66 (2)
- lucyleeow (2)
Top Pull Request Authors
- tennlee (128)
- Steph-Chong (95)
- nicholasloveday (66)
- HCookie (19)
- aidanjgriffiths (18)
- reza-armuei (8)
- durgals (6)
- John-Sharples (6)
- nikeethr (6)
- dependabot[bot] (5)
- lbluett (5)
- arshiaar (5)
- mareecarroll (4)
- andrewdhicks (3)
- JinghanFu (3)
Top Issue Labels
- documentation (40)
- good first issue (36)
- enhancement (32)
- new metric (18)
- refactoring (17)
- question (5)
- coding (4)
- investigation (4)
- hard to fix (2)
- emerging (1)
- help wanted (1)
- duplicate (1)
Top Pull Request Labels
- enhancement (10)
- dependencies (5)
- emerging (3)
- documentation (2)
- wontfix (1)
Package metadata
- Total packages: 1
- Total downloads:
- pypi: 3,568 last-month
- Total dependent packages: 0
- Total dependent repositories: 2
- Total versions: 24
- Total maintainers: 1
pypi.org: scores
Scores is a Python package containing mathematical functions for the verification, evaluation and optimisation of forecasts, predictions or models.
- Homepage:
- Documentation: https://scores.readthedocs.io/en/stable/
- Licenses: Apache Software License
- Latest release: 2.0.0 (published 5 months ago)
- Last Synced: 2025-04-25T13:33:49.796Z (1 day ago)
- Versions: 24
- Dependent Packages: 0
- Dependent Repositories: 2
- Downloads: 3,568 Last month
- Rankings:
- Dependent packages count: 9.973%
- Average: 11.102%
- Dependent repos count: 11.632%
- Downloads: 11.703%
- Maintainers (1)
Dependencies
- pip
- actions/checkout v3 composite
- actions/setup-python v3 composite
- bottleneck *
- myst-parser *
- pandas *
- scipy *
- scores *
- sphinx *
- sphinx-book-theme *
- xarray *
Score: 16.731228915096924