A curated list of open technology projects to sustain a stable climate, energy supply, biodiversity and natural resources.

Recent Releases of OasisLMF

OasisLMF - 1.19.0

OasisLMF Changelog - 1.19.0

  • #856 - improve memory usage of fmpy
  • #883 - Fix ri file detection
  • #877 - Feature/873 automated issues tests
  • #832 - Consolidate Arch2020 work

OasisLMF Notes

Improved memory usage for fmpy - (PR #880)

In fmpy, losses were stored in arrays where the array index corresponds to the sample id (sidx).
With the help of NumPy and Numba this storage is very efficient for many of the operations needed in the financial module, in particular when many sidx values are present (i.e. when the density is high).
However, it is weak on memory consumption as the number of samples increases (O(N) complexity).

Since models requiring a high number of samples generally have loss arrays of low density, one way to decrease memory usage, and also improve performance, is to store the losses in sparse arrays.

This PR introduces a new mode for fmpy in which sparse arrays are used for the computation. It is activated when the number of sidx values is greater than 16, i.e. when the maximum sidx value is above 14 (adding the special -3 and -1 sidx values makes 16).
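The dense-versus-sparse trade-off can be illustrated with a minimal NumPy sketch (dense_to_sparse is a hypothetical helper that only demonstrates the idea; fmpy's internal layout differs):

```python
import numpy as np

def dense_to_sparse(losses):
    """Convert a dense per-sidx loss array to (index, value) pairs,
    keeping only non-zero entries. Hypothetical helper, not fmpy's API."""
    idx = np.nonzero(losses)[0]
    return idx, losses[idx]

# Dense storage: one slot per sidx, mostly zeros at low density.
dense = np.zeros(10_000)
dense[[5, 42, 999]] = [1.5, 2.0, 0.25]

# Sparse storage keeps only 3 of 10,000 slots, mirroring the memory
# saving reported below for low-density loss arrays.
idx, val = dense_to_sparse(dense)
```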

Benchmark results were obtained on a complex 1,000-location portfolio with a model generating losses of 3% density.

[benchmark results chart]

Performance is measured as a speed-up factor relative to the GUL computation.

Under these conditions, which are representative of most high-sample models:

  • memory consumption goes from around 40 MB per 10K samples to 3 MB per 10K samples, a reduction of an order of magnitude.
    This is roughly in line with the 3% density of the losses (3% of 40 is 1.2).
  • for high sample counts, fmpy performance is improved by a factor of around 3.

Resolved issue test cases added to regression tests - (PR #877)

Regression tests now include a set of OED test cases linked to specific git issues which have been found and fixed.

Refactor genbash and LossGeneration to chunk events separately - (PR #879)

CLI Example:

NUM_CHUNKS=10
oasislmf model generate-oasis-files --oasis-files-dir  ~/ram/run/input

# run each chunk of events 
for (( i=1; i<=$NUM_CHUNKS; i++ )); do
    oasislmf model generate-losses-chunk --max-process-id $NUM_CHUNKS --process-number $i --model-run-dir ~/ram/run --oasis-files-dir ~/ram/run/input
    echo $i
done

# run output reports 
oasislmf model generate-losses-output --model-run-dir ~/ram/run/ --oasis-files-dir ~/ram/run/input/ 

This creates individual scripts to run each event chunk, plus a single output report script:

├── 1.run_analysis.sh
├── 2.run_analysis.sh
├── 3.run_analysis.sh
├── 4.run_analysis.sh
├── 5.run_analysis.sh
├── 6.run_analysis.sh
├── 7.run_analysis.sh
├── 8.run_analysis.sh
├── 9.run_analysis.sh
├── 10.run_analysis.sh
├── run_outputs.sh
  ..
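The chunking above partitions the event set into NUM_CHUNKS independent slices, one per --process-number. A minimal Python sketch of that partitioning idea (chunk_bounds is a hypothetical helper, not part of the oasislmf API):

```python
def chunk_bounds(num_events, num_chunks, process_number):
    """Return the (start, end) slice of event ids handled by one chunk.
    process_number is 1-based, mirroring --process-number above.
    A sketch of the partitioning idea, not eve's actual algorithm."""
    per_chunk, remainder = divmod(num_events, num_chunks)
    # Spread the remainder over the first `remainder` chunks.
    start = (process_number - 1) * per_chunk + min(process_number - 1, remainder)
    size = per_chunk + (1 if process_number <= remainder else 0)
    return start, start + size

# 10 chunks over 105 events: the first 5 chunks take 11 events each,
# the rest take 10, and together they cover every event exactly once.
bounds = [chunk_bounds(105, 10, i) for i in range(1, 11)]
```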

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild over 3 years ago

OasisLMF - 1.15.14

  • #856 - improve memory usage of fmpy

OasisLMF Notes

In fmpy, losses were stored in arrays where the array index corresponds to the sample id (sidx).
With the help of NumPy and Numba this storage is very efficient for many of the operations needed in the financial module, in particular when many sidx values are present (i.e. when the density is high).
However, it is weak on memory consumption as the number of samples increases (O(N) complexity).

Since models requiring a high number of samples generally have loss arrays of low density, one way to decrease memory usage, and also improve performance, is to store the losses in sparse arrays.

This PR introduces a new mode for fmpy in which sparse arrays are used for the computation. It is activated when the number of sidx values is greater than 16, i.e. when the maximum sidx value is above 14 (adding the special -3 and -1 sidx values makes 16).

Benchmark results were obtained on a complex 1,000-location portfolio with a model generating losses of 3% density.

[benchmark results chart]

Performance is measured as a speed-up factor relative to the GUL computation.

Under these conditions, which are representative of most high-sample models:

  • memory consumption goes from around 40 MB per 10K samples to 3 MB per 10K samples, a reduction of an order of magnitude.
    This is roughly in line with the 3% density of the losses (3% of 40 is 1.2).
  • for high sample counts, fmpy performance is improved by a factor of around 3.

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild over 3 years ago

OasisLMF - 1.19.0rc1

OasisLMF Changelog - 1.19.0rc1

  • #856 - improve memory usage of fmpy
  • #883 - Fix ri file detection
  • #877 - Feature/873 automated issues tests
  • #832 - Consolidate Arch2020 work

OasisLMF Notes

In fmpy, losses were stored in arrays where the array index corresponds to the sample id (sidx).
With the help of NumPy and Numba this storage is very efficient for many of the operations needed in the financial module, in particular when many sidx values are present (i.e. when the density is high).
However, it is weak on memory consumption as the number of samples increases (O(N) complexity).

Since models requiring a high number of samples generally have loss arrays of low density, one way to decrease memory usage, and also improve performance, is to store the losses in sparse arrays.

This PR introduces a new mode for fmpy in which sparse arrays are used for the computation. It is activated when the number of sidx values is greater than 16, i.e. when the maximum sidx value is above 14 (adding the special -3 and -1 sidx values makes 16).

Benchmark results were obtained on a complex 1,000-location portfolio with a model generating losses of 3% density.

[benchmark results chart]

Performance is measured as a speed-up factor relative to the GUL computation.

Under these conditions, which are representative of most high-sample models:

  • memory consumption goes from around 40 MB per 10K samples to 3 MB per 10K samples, a reduction of an order of magnitude.
    This is roughly in line with the 3% density of the losses (3% of 40 is 1.2).
  • for high sample counts, fmpy performance is improved by a factor of around 3.

Resolved issue test cases added to regression tests - (PR #877)

Regression tests now include a set of OED test cases linked to specific git issues which have been found and fixed.

Refactor genbash and LossGeneration to chunk events separately - (PR #879)

CLI Example:

NUM_CHUNKS=10
oasislmf model generate-oasis-files --oasis-files-dir  ~/ram/run/input

# run each chunk of events 
for (( i=1; i<=$NUM_CHUNKS; i++ )); do
    oasislmf model generate-losses-chunk --max-process-id $NUM_CHUNKS --process-number $i --model-run-dir ~/ram/run --oasis-files-dir ~/ram/run/input
    echo $i
done

# run output reports 
oasislmf model generate-losses-output --model-run-dir ~/ram/run/ --oasis-files-dir ~/ram/run/input/ 

This creates individual scripts to run each event chunk, plus a single output report script:

├── 1.run_analysis.sh
├── 2.run_analysis.sh
├── 3.run_analysis.sh
├── 4.run_analysis.sh
├── 5.run_analysis.sh
├── 6.run_analysis.sh
├── 7.run_analysis.sh
├── 8.run_analysis.sh
├── 9.run_analysis.sh
├── 10.run_analysis.sh
├── run_outputs.sh
  ..

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild over 3 years ago

OasisLMF - 1.18.0

OasisLMF Changelog - 1.18.0

  • #865 - correction for PolDed6All fields
  • #861 - Add PALT to genbash
  • #820 - Pol Fac Contracts
  • #837 - added new doc
  • #869 - Feature/761 fm tests
  • #862 - oasislmf test fm doesn't work and there is no documentation
  • #828 - RI FM File Generation issue when locations have zero TIV
  • #850 - Improve exit handler's process cleanup
  • #874 - Check and set minimum versions for optional packages
  • #846 - Update package dependencies for flexible lookup code
  • #852 - Issue with custom lookup_module_path relative path
  • #826 - Automate Change logs and release notes
  • #860 - slight interface changes for key server functions

OasisLMF Notes

FM supported fields documentation correction - (PR #865)

The PolDed6All, PolDedType6All, PolMinDed6All and PolMaxDed6All fields are now marked as supported in the documentation.

Enabled the PALT ORD outputs - (PR #866)

Updated Bash script to create palt.csv ORD files when selected in the analysis_settings file.

Example analysis_settings

    "gul_summaries": [
        {   
            "id": 1,
            "ord_output": {
                "alt_period": true
            }   
        }   
    ], 

New reports

$ tree output/
output/
├── analysis_settings.json
├── gul_S1_palt.csv
└── gul_S1_summary-info.csv

Introduce Vectorisation in generation of Reinsurance FM Files - (PR #868)

The introduction of vectorisation, and resultant dropping of the tree structure, should lead to an improvement in performance in the generation of reinsurance FM files.

Added documentation on supported OED financial fields - (PR #837)

New documentation covering which OED financial fields are supported, with an overview of how they are applied in the Oasis kernel.

Maintenance upgrade of fm validation tests - (PR #869)

Added scripts to manage test cases and align file formats in preparation for OED schema update

Fixed run acceptance test CLI command - (PR #871)

  • Fixed the command so it runs from the current working directory when that directory is a valid FM test case or holds subdirectories of FM test cases.
  • Updated the help text with examples
$ oasislmf test fm --help
usage: oasislmf test fm [-h] [-V] [-C CONFIG] [-c TEST_CASE_NAME] [-l] [-t TEST_CASE_DIR] [-r RUN_DIR] [-p NUM_SUBPERILS] [--test-tolerance TEST_TOLERANCE] [--fmpy [FMPY]] [--fmpy-low-memory [FMPY_LOW_MEMORY]] [--fmpy-sort-output [FMPY_SORT_OUTPUT]]

Run FM acceptance tests from "oasisLMF/validation/*"

    Example use: 
    1. Run all test cases: "oasisLMF/validation$ oasislmf test fm"
        Stating oasislmf command - RunFmTest
        RUNNING: oasislmf.manager.interface
        Running: 4 Tests from '/home/sam/repos/core/oasisLMF/validation'
        Test names: ['insurance', 'insurance_step', 'reinsurance1', 'reinsurance2']
         ...

    2. Run Directly from a acceptance test Directory: "oasisLMF/validation/reinsurance1$ oasislmf test fm"
        Stating oasislmf command - RunFmTest
        RUNNING: oasislmf.manager.interface
         ... 

    3. Select test case sub-directory: "oasisLMF/validation$ oasislmf test fm --test-case-name reinsurance1"
        Stating oasislmf command - RunFmTest
        RUNNING: oasislmf.manager.interface
         ...

Fixed issue when running reinsurance with a location file that has zero TIV rows - (PR #872)

  • Added the loc_idx column directly after loading the location.csv file and before any rows are dropped

Updated bash exit handler - (PR #870)

  • Kill errant processes in reverse order
  • Added inherit_errexit option to run_ktools scripts

Add optional package dependencies for an oasislmf installation - (PR #847)

Added a new dependency file, optional-package.in, which lists the packages needed for the built-in lookup class and extra utility commands. These won't install by default, so if the host system is missing the C++ library libspatialindex the oasislmf package will still install, just without the built-in lookup functionality.

To install oasislmf with all the extra dependencies, use pip install oasislmf[extra]

Lookup module path and class loading fixes - (PR #854)

  1. Fixed the relative path reference for lookup_module_path so it is resolved against the model config path rather than the execution path.
  2. Allowed the lookup class to be specified from the Python environment:
    using 'lookup_module' will load the module and use the class named f"{self.config['model']['model_id']}KeysLookup";
    using 'lookup_class' will load and return the named class from the environment, e.g.
    "lookup_class": "jba_fly_oasis.lookup.MySpecialLookup" behaves like "from jba_fly_oasis.lookup import MySpecialLookup".
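The dotted-path class loading described in point 2 can be sketched with importlib (load_lookup_class is a hypothetical helper illustrating the mechanism, not oasislmf's actual resolver):

```python
import importlib

def load_lookup_class(dotted_path):
    """Load a class from a dotted 'package.module.ClassName' path,
    as the 'lookup_class' setting described above does.
    A sketch; the real oasislmf resolution logic may differ."""
    module_path, _, class_name = dotted_path.rpartition(".")
    module = importlib.import_module(module_path)
    return getattr(module, class_name)

# Example with a standard-library class in place of a model lookup.
cls = load_lookup_class("collections.OrderedDict")
```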

Automated changelogs and release notes - (PR #826)

The build system will now automatically create changelogs and release notes

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild almost 4 years ago

OasisLMF - 1.15.13

OasisLMF Changelog - 1.15.13

  • #828 - RI FM File Generation issue when locations have zero TIV
  • #850 - Improve exit handler's process cleanup
  • #820 - Pol Fac Contracts

OasisLMF Notes

Fixed issue when running reinsurance with a location file that has zero TIV rows - (PR #872)

  • Add loc_idx column directly after loading in the location.csv file and before any rows get dropped

Updated bash exit handler - (PR #870)

  • Kill errant processes in reverse order
  • Added inherit_errexit option to run_ktools scripts

Introduce Vectorisation in generation of Reinsurance FM Files - (PR #868)

The introduction of vectorisation, and resultant dropping of the tree structure, should lead to an improvement in performance in the generation of reinsurance FM files.

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild almost 4 years ago

OasisLMF - 1.17.0

OasisLMF Changelog - 1.17.0

  • #833 - Minor issue - error parsing fm CLI help text
  • #831 - Add timestamps to logger
  • #844 - Built-in Lookup revamp
  • #840 - Fix flaky tests in model_preparation/test_lookup.py
  • #843 - Fix CI pipeline error, double tagging release causes script to fail
  • #842 - Produce summary index files by default to reduce memory use of ktools output components
  • #816 - Issue in coverage_type_id grouping of ground up loss results for multi-peril
  • #809 - Error handling for invalid oasislmf.json config files
  • #849 - Fix CVE-2021-33503
  • #821 - Add missing items to data settings
  • #824 - Inputs directory preparation issue when ORD is enabled
  • #826 - Automate Change logs and release notes
  • #822 - Ktools exit handler killing off bash logging on exit

OasisLMF Notes

Added function timestamps to 'info' log level - (PR #835)

Execution times are logged if a function takes longer than 0.01 seconds to complete.

Example Output

COMPLETED: oasislmf.execution.bin.csv_to_bin in 0.02s
RUNNING: oasislmf.execution.bin.csv_to_bin
COMPLETED: oasislmf.execution.bin.csv_to_bin in 0.01s
RUNNING: oasislmf.execution.bin.prepare_run_inputs
RUNNING: oasislmf.execution.runner.run
[OK] eve
[OK] getmodel
[OK] gulcalc
[OK] summarycalc
[OK] eltcalc
[OK] aalcalc
[OK] leccalc
Run Completed
Run Completed

COMPLETED: oasislmf.execution.runner.run in 4.25s
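The RUNNING/COMPLETED lines above could be produced by a decorator along these lines (a sketch of the behaviour only; log_timing is a hypothetical name, not the actual oasislmf implementation):

```python
import functools
import logging
import time

logger = logging.getLogger(__name__)

def log_timing(threshold=0.01):
    """Log a RUNNING line on entry and a COMPLETED line with the
    elapsed time for calls slower than `threshold` seconds.
    Hypothetical sketch of the behaviour described above."""
    def decorator(func):
        @functools.wraps(func)
        def wrapper(*args, **kwargs):
            name = f"{func.__module__}.{func.__name__}"
            logger.info("RUNNING: %s", name)
            start = time.monotonic()
            result = func(*args, **kwargs)
            elapsed = time.monotonic() - start
            if elapsed > threshold:
                logger.info("COMPLETED: %s in %.2fs", name, elapsed)
            return result
        return wrapper
    return decorator

@log_timing()
def slow_step():
    time.sleep(0.02)  # longer than the 0.01s threshold, so it is logged
    return "done"
```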

Built-in Lookup revamp - (PR #836)

The built-in Lookup class implements OasisLookupInterface and provides a data-driven lookup capability that is both flexible (it supports sub-classing) and efficient.
Lookup functions provided include:
    - location preparation (setting default min and max values)
    - two geospatial options (direct computation or using rtree)
    - merge, to map column keys to an id
    - split_loc_perils_covered, to properly use data from LocPerilsCovered
    - simple_pivot, to pivot columns of the locations dataframe into multiple rows
An example config is provided in PiWind's lookup_new_key_server.json.

Set summarycalc option to output summary index files in genbash - (PR #845)

The ktools binary summarycalc can now produce index files, which leads to a reduction in memory use by the downstream output components leccalc and ordleccalc. To take advantage of this, the option to output these summary index files is set as the default in genbash.

Fixed ordering of item_id column for gulsummaryxref - (PR #817)

The issue was introduced in 1.15.2 by https://github.com/OasisLMF/OasisLMF/pull/774

Before this change, the 'gul' summary groups were created using 'gul_summary_map.csv'; it now uses 'fm_summary_map.csv' so that the IL fields are included.

The problem occurs because the summary grouping code expects the id_set_index column to be sorted before grouping; however, this column differs between gul and fm, and fm_summary_map defaults to sorting by output_id.

id_set_index columns

gul == item_id
fm == output_id

fm_summary_map

   acc_idx accnumber  agg_id  coverage_id  coverage_type_id  layer_id  loc_id  loc_idx locnumber peril_id polnumber portnumber         tiv  output_id  item_id
0        0      PPT3       1            1                 1         1       1        0      PPT3      WTC      PPT3  T20100_C1  10000000.0          1        1   
1        0      PPT3       3            1                 1         1       1        0      PPT3      WSS      PPT3  T20100_C1  10000000.0          2        3   
2        0      PPT3       2            2                 3         1       1        0      PPT3      WTC      PPT3  T20100_C1  20000000.0          3        2   
3        0      PPT3       4            2                 3         1       1        0      PPT3      WSS      PPT3  T20100_C1  20000000.0          4        4
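The fix amounts to sorting on the correct id-set column before grouping, which can be sketched in pandas (sort_for_grouping is a hypothetical helper, and the frame below is a cut-down version of the fm_summary_map above):

```python
import pandas as pd

def sort_for_grouping(summary_map, id_set_index):
    """Sort the summary map on the id-set column before grouping:
    'item_id' for gul, 'output_id' for fm. A sketch of the fix,
    not oasislmf's actual grouping code."""
    return summary_map.sort_values(id_set_index, kind="stable").reset_index(drop=True)

# Cut-down fm_summary_map: rows arrive sorted by output_id.
fm_map = pd.DataFrame({
    "output_id": [1, 2, 3, 4],
    "item_id":   [1, 3, 2, 4],
})

# A gul-style grouping must re-sort on item_id first; the row order
# (and hence the output_id column) changes accordingly.
gul_ordered = sort_for_grouping(fm_map, "item_id")
```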

Updated model settings schema to extend data settings - (PR #821)

  • docs_version and test_files_version: track metadata for model-specific documentation and test file versions
  • keys_data_version: some models have their keys_data included in model_data, in which case this item can be empty
  • additional_assets.deploy: needed to target specific deployments (the default value for all is "always")

Fixed issue creating occurrence.bin when using ord options in analysis settings. - (PR #825)

Using only ORD output options skipped occurrence file creation, because the binary preparation code only checked for
"lec_output": true, which is not needed for ORD settings. The following JSON now correctly prepares an occurrence.bin file. See issue #824 for details.

"gul_summaries": [
    {
        "id": 1,
        "ord_output": {
            "ept_full_uncertainty_aep": true,
            "return_period_file": true
        }
    }
],
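The corrected decision, triggering occurrence preparation on either the legacy lec_output flag or any enabled ORD output, can be sketched as follows (needs_occurrence_file is a hypothetical helper, not the actual oasislmf code):

```python
def needs_occurrence_file(summary):
    """Decide whether occurrence.bin must be prepared for one summary
    block: either the legacy 'lec_output' flag or any enabled ORD
    output should trigger it. A sketch of the fixed logic."""
    if summary.get("lec_output"):
        return True
    ord_output = summary.get("ord_output", {})
    return any(ord_output.values())

# Mirrors the gul_summaries entry shown above: ORD-only output.
gul_summary = {
    "id": 1,
    "ord_output": {"ept_full_uncertainty_aep": True, "return_period_file": True},
}
```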

Automated changelogs and release notes - (PR #826)

The build system will now automatically create changelogs and release notes

Updated bash exit handler to ignore logging commands - (PR #827)

Fix for issue #822: with this update the exit handler no longer kills the bash logging command on exit.

Example of issue

/data/output/run/run_ktools.sh: line 13: 73690 Killed                  tee -ia log/bash.log
/data/output/run/run_ktools.sh: line 14: 73692 Killed                  tee -ia log/bash.log 1>&2

After fix

$ cat log/subprocess_list
    PID TTY      STAT   TIME COMMAND
  19325 pts/4    Ss     0:00 bash
  35622 pts/4    Sl     0:00  \_ /usr/bin/python -O /usr/bin/ranger
  35688 pts/4    S      0:00      \_ /bin/bash
  40382 pts/4    S+     0:00          \_ /bin/bash ./run_ktools.sh
  40387 pts/4    S+     0:00              \_ /bin/bash ./run_ktools.sh
  40389 pts/4    S+     0:00              |   \_ tee -ia log/bash.log
  40388 pts/4    S+     0:00              \_ /bin/bash ./run_ktools.sh
  40390 pts/4    S+     0:00              |   \_ tee -ia log/bash.log
  41134 pts/4    R+     0:00              \_ ps f -g 19325
  35792 pts/4    S      0:00 /bin/bash ./run_ktools.sh
  35791 pts/4    S      0:00 /bin/bash ./run_ktools.sh
  35781 pts/4    S      0:00 /bin/bash ./run_ktools.sh

$ cat log/subprocess_list | egrep -v *\\.log$ | egrep -v *\\.sh$
    PID TTY      STAT   TIME COMMAND
  19325 pts/4    Ss     0:00 bash
  35622 pts/4    Sl     0:00  \_ /usr/bin/python -O /usr/bin/ranger
  35688 pts/4    S      0:00      \_ /bin/bash
  41134 pts/4    R+     0:00              \_ ps f -g 19325

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild almost 4 years ago

OasisLMF - 1.15.10

OasisLMF Changelog - 1.15.10

  • #822 - Ktools exit handler killing off bash logging on exit
  • #843 - Fix CI pipeline error, double tagging release causes script to fail

OasisLMF Notes

Updated bash exit handler to ignore logging commands - (PR #827)

Fix for issue #822: with this update the exit handler no longer kills the bash logging command on exit.

Example of issue

/data/output/run/run_ktools.sh: line 13: 73690 Killed                  tee -ia log/bash.log
/data/output/run/run_ktools.sh: line 14: 73692 Killed                  tee -ia log/bash.log 1>&2

After fix

$ cat log/subprocess_list
    PID TTY      STAT   TIME COMMAND
  19325 pts/4    Ss     0:00 bash
  35622 pts/4    Sl     0:00  \_ /usr/bin/python -O /usr/bin/ranger
  35688 pts/4    S      0:00      \_ /bin/bash
  40382 pts/4    S+     0:00          \_ /bin/bash ./run_ktools.sh
  40387 pts/4    S+     0:00              \_ /bin/bash ./run_ktools.sh
  40389 pts/4    S+     0:00              |   \_ tee -ia log/bash.log
  40388 pts/4    S+     0:00              \_ /bin/bash ./run_ktools.sh
  40390 pts/4    S+     0:00              |   \_ tee -ia log/bash.log
  41134 pts/4    R+     0:00              \_ ps f -g 19325
  35792 pts/4    S      0:00 /bin/bash ./run_ktools.sh
  35791 pts/4    S      0:00 /bin/bash ./run_ktools.sh
  35781 pts/4    S      0:00 /bin/bash ./run_ktools.sh

$ cat log/subprocess_list | egrep -v *\\.log$ | egrep -v *\\.sh$
    PID TTY      STAT   TIME COMMAND
  19325 pts/4    Ss     0:00 bash
  35622 pts/4    Sl     0:00  \_ /usr/bin/python -O /usr/bin/ranger
  35688 pts/4    S      0:00      \_ /bin/bash
  41134 pts/4    R+     0:00              \_ ps f -g 19325

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild almost 4 years ago

OasisLMF - 1.17.0rc1

Changelog - 1.17.0rc1

  • #833 - Minor issue - error parsing fm CLI help text
  • #831 - Add timestamps to logger
  • #844 - Built-in Lookup revamp
  • #840 - Fix flaky tests in model_preparation/test_lookup.py
  • #843 - Fix CI pipeline error, double tagging release causes script to fail
  • #842 - Produce summary index files by default to reduce memory use of ktools output components
  • #816 - Issue in coverage_type_id grouping of ground up loss results for multi-peril
  • #809 - Error handling for invalid oasislmf.json config files
  • #821 - Add missing items to data settings
  • #824 - Inputs directory preparation issue when ORD is enabled
  • #826 - Automate Change logs and release notes
  • #822 - Ktools exit handler killing off bash logging on exit

OasisLMF Notes

Added function timestamps to 'info' log level

Execution times are logged if a function takes longer than 0.01 seconds to complete.

Example Output

COMPLETED: oasislmf.execution.bin.csv_to_bin in 0.02s
RUNNING: oasislmf.execution.bin.csv_to_bin
COMPLETED: oasislmf.execution.bin.csv_to_bin in 0.01s
RUNNING: oasislmf.execution.bin.prepare_run_inputs
RUNNING: oasislmf.execution.runner.run
[OK] eve
[OK] getmodel
[OK] gulcalc
[OK] summarycalc
[OK] eltcalc
[OK] aalcalc
[OK] leccalc
Run Completed
Run Completed

COMPLETED: oasislmf.execution.runner.run in 4.25s

Built-in Lookup revamp

The built-in Lookup class implements OasisLookupInterface and provides a data-driven lookup capability that is both flexible (it supports sub-classing) and efficient.
Lookup functions provided include:
    - location preparation (setting default min and max values)
    - two geospatial options (direct computation or using rtree)
    - merge, to map column keys to an id
    - split_loc_perils_covered, to properly use data from LocPerilsCovered
    - simple_pivot, to pivot columns of the locations dataframe into multiple rows
An example config is provided in PiWind's lookup_new_key_server.json.

Set summarycalc option to output summary index files in genbash

The ktools binary summarycalc can now produce index files, which leads to a reduction in memory use by the downstream output components leccalc and ordleccalc. To take advantage of this, the option to output these summary index files is set as the default in genbash.

Fixed ordering of item_id column for gulsummaryxref

The issue was introduced in 1.15.2 by https://github.com/OasisLMF/OasisLMF/pull/774

Before this change, the 'gul' summary groups were created using 'gul_summary_map.csv'; it now uses 'fm_summary_map.csv' so that the IL fields are included.

The problem occurs because the summary grouping code expects the id_set_index column to be sorted before grouping; however, this column differs between gul and fm, and fm_summary_map defaults to sorting by output_id.

id_set_index columns

gul == item_id
fm == output_id

fm_summary_map

   acc_idx accnumber  agg_id  coverage_id  coverage_type_id  layer_id  loc_id  loc_idx locnumber peril_id polnumber portnumber         tiv  output_id  item_id
0        0      PPT3       1            1                 1         1       1        0      PPT3      WTC      PPT3  T20100_C1  10000000.0          1        1   
1        0      PPT3       3            1                 1         1       1        0      PPT3      WSS      PPT3  T20100_C1  10000000.0          2        3   
2        0      PPT3       2            2                 3         1       1        0      PPT3      WTC      PPT3  T20100_C1  20000000.0          3        2   
3        0      PPT3       4            2                 3         1       1        0      PPT3      WSS      PPT3  T20100_C1  20000000.0          4        4

Updated model settings schema to extend data settings

  • docs_version and test_files_version: track metadata for model-specific documentation and test file versions
  • keys_data_version: some models have their keys_data included in model_data, in which case this item can be empty
  • additional_assets.deploy: needed to target specific deployments (the default value for all is "always")

Fixed issue creating occurrence.bin when using ord options in analysis settings.

Using only ORD output options skipped occurrence file creation, because the binary preparation code only checked for
"lec_output": true, which is not needed for ORD settings. The following JSON now correctly prepares an occurrence.bin file. See issue #824 for details.

"gul_summaries": [
    {
        "id": 1,
        "ord_output": {
            "ept_full_uncertainty_aep": true,
            "return_period_file": true
        }
    }
],

Automated changelogs and release notes

The build system will now automatically create changelogs and release notes

Updated bash exit handler to ignore logging commands

Fix for issue #822: with this update the exit handler no longer kills the bash logging command on exit.

Example of issue

/data/output/run/run_ktools.sh: line 13: 73690 Killed                  tee -ia log/bash.log
/data/output/run/run_ktools.sh: line 14: 73692 Killed                  tee -ia log/bash.log 1>&2

After fix

$ cat log/subprocess_list
    PID TTY      STAT   TIME COMMAND
  19325 pts/4    Ss     0:00 bash
  35622 pts/4    Sl     0:00  \_ /usr/bin/python -O /usr/bin/ranger
  35688 pts/4    S      0:00      \_ /bin/bash
  40382 pts/4    S+     0:00          \_ /bin/bash ./run_ktools.sh
  40387 pts/4    S+     0:00              \_ /bin/bash ./run_ktools.sh
  40389 pts/4    S+     0:00              |   \_ tee -ia log/bash.log
  40388 pts/4    S+     0:00              \_ /bin/bash ./run_ktools.sh
  40390 pts/4    S+     0:00              |   \_ tee -ia log/bash.log
  41134 pts/4    R+     0:00              \_ ps f -g 19325
  35792 pts/4    S      0:00 /bin/bash ./run_ktools.sh
  35791 pts/4    S      0:00 /bin/bash ./run_ktools.sh
  35781 pts/4    S      0:00 /bin/bash ./run_ktools.sh

$ cat log/subprocess_list | egrep -v *\\.log$ | egrep -v *\\.sh$
    PID TTY      STAT   TIME COMMAND
  19325 pts/4    Ss     0:00 bash
  35622 pts/4    Sl     0:00  \_ /usr/bin/python -O /usr/bin/ranger
  35688 pts/4    S      0:00      \_ /bin/bash
  41134 pts/4    R+     0:00              \_ ps f -g 19325

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild almost 4 years ago

OasisLMF - 1.15.9

Changelog - 1.15.9

  • #833 - Minor issue - error parsing fm CLI help text
  • #831 - Add timestamps to logger
  • #840 - Fix flaky tests in model_preparation/test_lookup.py
  • #821 - Add missing items to data settings
  • #826 - Automate Change logs and release notes

OasisLMF Notes

Added function timestamps to 'info' log level

Execution times are logged if a function takes longer than 0.01 seconds to complete.

Example Output

COMPLETED: oasislmf.execution.bin.csv_to_bin in 0.02s
RUNNING: oasislmf.execution.bin.csv_to_bin
COMPLETED: oasislmf.execution.bin.csv_to_bin in 0.01s
RUNNING: oasislmf.execution.bin.prepare_run_inputs
RUNNING: oasislmf.execution.runner.run
[OK] eve
[OK] getmodel
[OK] gulcalc
[OK] summarycalc
[OK] eltcalc
[OK] aalcalc
[OK] leccalc
Run Completed
Run Completed

COMPLETED: oasislmf.execution.runner.run in 4.25s

Updated model settings schema to extend data settings

  • docs_version and test_files_version: track metadata for model-specific documentation and test file versions
  • keys_data_version: some models have their keys_data included in model_data, in which case this item can be empty
  • additional_assets.deploy: needed to target specific deployments (the default value for all is "always")

Automated changelogs and release notes

The build system will now automatically create changelogs and release notes

Climate Change - Natural Hazard and Storm - Python
Published by sambles almost 4 years ago

OasisLMF - 1.15.8

  • #816 - Fix ordering of item_id column for gulsummaryxref
  • #814 - Fix back allocation child loss loop
  • #807 - Fixed fmpy numerical errors for step policies producing gross > ground up

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild about 4 years ago

OasisLMF - 1.16.0

  • #669 - Revamp of the Key service for improved performance (PR-792)
  • #802 - Fix for null loss in max deductible case
  • #766 - Updated FM python documentation
  • #753 - Added ORD output options for ept/psept and updated json schema
  • #814 - Fix back allocation child loss loop
  • #815 - Update requirements and set tests to Python3.8
  • #806 - Store analysis run settings to outputs via the MDK
  • #807 - Fixed fmpy numerical errors for step policies producing gross > ground up

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild about 4 years ago

OasisLMF - 1.16.0rc2

  • Updated Ktools to v3.6.0-rc3

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild about 4 years ago

OasisLMF - 1.15.6

  • #803 - Hotfix - Partial fix for Max Ded back allocation in fmpy

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild about 4 years ago

OasisLMF - 1.16.0rc1

  • #669 - Revamp of the Key service for improved performance (PR-792)
  • #802 - Fix for null loss in max deductible case
  • #766 - Updated FM python documentation
  • #753 - Added ORD output options for ept/psept and updated json schema

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild about 4 years ago

OasisLMF - 1.15.5

Hotfix 1.15.5

  • #798 - Hotfix - Fix process cleanup on ktools script exit
  • #799 - Hotfix - Fix fmpy, multilayer stream writer for RI
  • #794 - Hotfix - Fix column duplication when using "tiv, loc_id, coverage_type_id" in oed_field

Hotfix 1.15.4

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild about 4 years ago

OasisLMF - 1.15.3

  • #780 - Fix for fmpy, last event missing when using event terminator 0,0

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild about 4 years ago

OasisLMF - 1.15.2

  • #777 - Summarise ground up only runs using Insured loss fields
  • #776 - Fix tiv summary info feature, Pandas compatibility & column select issue

Climate Change - Natural Hazard and Storm - Python
Published by awsbuild about 4 years ago

OasisLMF - 1.15.1

  • #771 - Hotfix - Fix running GUL only analyses with fmpy (Only in 1.15.1)

Published by awsbuild about 4 years ago

OasisLMF - 1.15.0

  • #765 - Add a first pass of FM python documentation
  • #770 - Fix issue in lookup factory no results check
  • #755 - Added updates and fixes to the fm testing tool
  • #759 - Switched fmpy to the default financial module
  • #688 - Added TIV reporting to summary info files
  • #623 - Added check to raise an error if a locations file references account numbers missing from the account file.
  • #749 - The Group ids can now be set by the following internal oasis fields 'item_id', 'peril_id', 'coverage_id', and 'coverage_type_id'
  • #760 - Upgraded test harness for financial module and added numerical tests for fmpy.
  • #754 - Added validation for unsupported special conditions
  • #752 - Fixed issue with fmpy - not calculating net loss across all layers correctly.
  • #751 - Remove dependence on ReinsNumber order when assigning layer ID
  • #763 - Dropped binary wheel package for Mac OSX
  • #750 - Switched oasislmf exposure run to use gul stream type 2 by default
  • #391 - Added fix so an error is raised when no data is returned from the keys server.

Published by awsbuild about 4 years ago

OasisLMF - 1.14.0

Published by awsbuild about 4 years ago

OasisLMF - 1.13.2

  • #690 - Raise error if output node is missing its output_id
  • #700 - Update error guard to cover all ktools binaries
  • #701 - Fixed api search crash when metadata is empty
  • #702 - Fixed error when input data contains commas
  • #705 - Select keys generator based on class type
  • #710 - Added missing layer calcrules for limit only
  • #712 - Fix missing gul_errors_map.csv file
  • #713 - Fix for gul_errors_map containing duplicate columns
  • #722 - Fixed error creating summary levels file with Pandas 1.2.0

Published by awsbuild over 4 years ago

OasisLMF - 1.13.1

  • #709 - Fix issue with generation of profile IDs for step policies that include separate coverages

Published by awsbuild over 4 years ago

OasisLMF - 1.13.0

  • #694 - Schema update, restrict output options by eventset
  • #670 - Add CLI flags for lookup multiprocessing options
  • #695 - Set default value of optional OED step policy fields to 0
  • #686 - Fixed fmpy numerical issues when using allocrule 1
  • #681 - Added fmpy support for stepped policies
  • #680 - Added user defined return periods option to analysis_settings.json
  • #677 - Enabled Fmpy to handle multiple input streams
  • #678 - Fixed environment variable loading

Published by awsbuild over 4 years ago

OasisLMF - 1.12.1

  • #674 - Introduce check for step policies in genbash

Published by awsbuild over 4 years ago

OasisLMF - 1.12.0

  • #413 - Peril handling in input generation
  • #661 - Added experimental financial module written in Python 'fmpy'
  • #662 - Define relationships between event and occurrence in model_settings
  • #671 - Fix issue with loading booleans in oasislmf.json and corrected the 'ktools-fifo-relative' flag
  • #666 - Fix files created before generate-oasis-files being cleared

Published by awsbuild over 4 years ago

OasisLMF - 1.11.1

Hotfix 1.11.1

  • #653 - Fix pre-analysis exposure modification for generate-oasis-files command

Published by awsbuild over 4 years ago

OasisLMF - 1.11.0

Published by awsbuild over 4 years ago

OasisLMF - 1.10.2

  • 8fdc512 - Fix issue with introduction of erroneous duplicate rows when calculating aggregated TIVs

Published by awsbuild over 4 years ago

OasisLMF - 1.10.1

  • Fix issue with supplier model runner

Published by awsbuild over 4 years ago

OasisLMF - 1.10.0

  • #573 - Extract and apply default values for OED mapped FM terms
  • #581 - Split calc. rules files
  • #604 - Include unsupported coverages in type 2 financial terms calculation
  • #608 - Integration of GUL-FM load balancer
  • #614 - Refactor oasislmf package
  • #617 - Fix TypeError in write_exposure_summary
  • #636 - Improve error handling in run_ktools.sh
  • Fix minor issues in oasis file generation and complex model runs

Published by awsbuild over 4 years ago

OasisLMF - 1.9.1

Hotfix release (1.9.1)

  • #630 - full_correlation gulcalc option creates large output files.

Published by awsbuild over 4 years ago

OasisLMF - 1.9.0

  • #566 - Handle unlimited LayerLimit without large default value
  • #574 - Use LayerNumber to identify unique policy layers in gross fm file generation
  • #578 - Added missing combination of terms in calcrules
  • #603 - Add type 2 financial terms tests for multi-peril to regression test
  • PR 600 - Added scripts for generating example model data for testing.

Published by awsbuild almost 5 years ago

OasisLMF - 1.8.3

  • #601 - Fix calculations for type 2 deductibles and limits in multi-peril models

Published by awsbuild almost 5 years ago

OasisLMF - 1.8.2

Published by awsbuild almost 5 years ago

OasisLMF - 1.8.1

  • #589 - Schema fix to allow for 0 samples
  • #583 - Reduce memory use in gul_inputs creation (DanielFEvans)
  • #582 - Check for calc_type in all sections

Published by awsbuild almost 5 years ago

OasisLMF - 1.8.0

  • #579 - Install complex_itemstobin and complex_itemstocsv
  • #565 - Non-unicode CSV data is not handled neatly
  • #570 - Issue with item_id to from_agg_id mapping at level 1
  • #556 - review calc_rules.csv mapping for duplicates and logical gaps
  • #549 - Add FM Tests May 2020
  • #555 - Add JSON schema validation on CLI
  • #577 - Add api client progressbars for OasisAtScale

Published by awsbuild almost 5 years ago

OasisLMF - 1.7.1

Issues Fixed

  • #553 - calc rule mapping error
  • #550 - genbash fmcalc and full_correlation
  • #548 - GUL Alloc Rule 0
  • #437 - Invalid coverage file generated
  • default run_dir in oasislmf model generate-exposure-pre-analysis command

Notes

New stream type flag on loss generation commands

The previous flag for selecting the deprecated GUL stream type (items/coverage) --ktools-alloc-rule-gul 0 has been replaced with --ktools-legacy-gul-stream.

  • --ktools-alloc-rule-gul 0 - Changes the back-allocation for ground up loss to reduce RAM usage.
  • --ktools-alloc-rule-gul 1 - Unchanged from 1.7.0.
  • --ktools-legacy-gul-stream - Disables the ground up loss allocation rule and switches the output to the older stream type.

Duplicate location rows are dropped

Added handling so that when duplicate loc_ids are found, the MDK sends a warning to the user.

WARNING: Duplicate keys ['portnumber', 'accnumber', 'locnumber'] detected in location file
     oasislmf doesn't currently support multiple terms for a single location
     dropping the following row(s): [6, 7]

This is a placeholder fix to prevent invalid coverage file generation until we decide how to handle multiple terms per location.
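The duplicate-key check described above can be sketched in pandas (a minimal illustration, assuming lower-cased OED key columns; this is not the MDK's actual implementation):

```python
import logging
import pandas as pd

def drop_duplicate_locations(loc_df):
    """Warn about and drop rows sharing the same OED location key,
    keeping only the first occurrence of each key."""
    key = ['portnumber', 'accnumber', 'locnumber']
    dups = loc_df.duplicated(subset=key, keep='first')
    if dups.any():
        logging.warning(
            "Duplicate keys %s detected in location file, "
            "dropping the following row(s): %s", key, list(loc_df.index[dups]))
    return loc_df[~dups]

locs = pd.DataFrame({
    'portnumber': [1, 1, 1],
    'accnumber': ['A11111', 'A11111', 'A11111'],
    'locnumber': [10, 10, 20],
})
print(drop_duplicate_locations(locs).index.tolist())  # → [0, 2]
```

pandas.DataFrame.duplicated with keep='first' marks every repeat after the first, which matches the behaviour described in the warning message.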

Location files now indexed by sorted values

The internal location row identifier loc_id is now assigned based on column values rather than the order a row appears in the location file.

Example

The two location files 1 and 2 below are identical except for row ordering; previously this produced oasis files with differing item ids, jumbling the output.
This change ensures both files produce identical output.

OED Location File 1

PortNumber  AccNumber  LocNumber    ... 
1           A11111     10002082046  ... 
1           A11111     10002082047  ... 
1           A11111     10002082048  ... 
1           A11111     10002082049  ... 
1           A11111     10002082050  ... 

OED Location File 2

PortNumber  AccNumber  LocNumber    ... 
1           A11111     10002082049  ... 
1           A11111     10002082050  ... 
1           A11111     10002082046  ... 
1           A11111     10002082047  ... 
1           A11111     10002082048  ... 

Internal DataFrame (from File 2)

index  portnumber accnumber    locnumber    loc_id
0       1         A11111       10002082049  4   
1       1         A11111       10002082050  5   
2       1         A11111       10002082046  1   
3       1         A11111       10002082047  2   
4       1         A11111       10002082048  3   
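The sorted-value indexing above can be sketched as follows (an illustrative reimplementation, not the library's exact code; column names follow the dataframe shown):

```python
import pandas as pd

def assign_loc_id(df):
    """Derive loc_id from the sorted unique (portnumber, accnumber,
    locnumber) combinations, so the id is independent of row order."""
    key_cols = ['portnumber', 'accnumber', 'locnumber']
    unique_keys = sorted(map(tuple, df[key_cols].drop_duplicates().values))
    ids = {key: i + 1 for i, key in enumerate(unique_keys)}
    df = df.copy()
    df['loc_id'] = [ids[tuple(row)] for row in df[key_cols].values]
    return df

# OED Location File 2 from the example above
file_2 = pd.DataFrame({
    'portnumber': [1] * 5,
    'accnumber': ['A11111'] * 5,
    'locnumber': [10002082049, 10002082050, 10002082046,
                  10002082047, 10002082048],
})
print(assign_loc_id(file_2)['loc_id'].tolist())  # → [4, 5, 1, 2, 3]
```

Because the ids come from sorted key values rather than file position, reordering the input rows leaves each location's loc_id unchanged.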

Published by awsbuild almost 5 years ago

OasisLMF - 1.7.0

New features

  • #528 - FM validation tests with % damage range
  • #531 - Item file ordering of item_id
  • #533 - New FM acceptance tests
  • #536 - Extend calcrules
  • #497 - Add exception wrapping to OasisException
  • #535 - Pre-analysis exposure modification (CLI interface)
  • #530 - Revamped CLI structure

Notes

Pre-analysis exposure modification

Added a pre-analysis hook for exposure modification which processes the OED input files via custom python code.
Example use cases are geo-coding, exposure enhancement, or dis-aggregation.

It is invoked either from the new sub-command oasislmf model generate-exposure-pre-analysis, or called automatically via oasislmf model run ... when the required flags are set.

The run method can be found in oasislmf/model_preparation/exposure_pre_analysis.py
and can apply modifications to each of the four OED input files: location, account, reinsinfo and reinsscope.

New CLI flags
  • --exposure-pre-analysis-setting-json <file-path>, JSON data passed to the exposure modification module
  • --exposure-pre-analysis-module <file-path>, Exposure pre-analysis module path
  • --exposure-pre-analysis-class-name <string>, An optional argument to select a class from the exposure pre-analysis module

Example

CLI run command

oasislmf model run --exposure-pre-analysis-module custom_module/exposure_pre_analysis_simple.py --exposure-pre-analysis-class-name ExposurePreAnalysis  --exposure-pre-analysis-setting-json tests/data/exposure_pre_analysis_settings.json

Custom exposure modification class

File: custom_module/exposure_pre_analysis_simple.py

import pandas as pd

class ExposurePreAnalysis:
    """
    Example of custom module called by oasislmf/model_preparation/ExposurePreAnalysis.py
    """

    def __init__(self, raw_oed_location_csv, oed_location_csv, exposure_pre_analysis_setting, **kwargs):
        self.raw_oed_location_csv = raw_oed_location_csv
        self.oed_location_csv = oed_location_csv
        self.exposure_pre_analysis_setting = exposure_pre_analysis_setting

    def run(self):
        panda_df = pd.read_csv(self.raw_oed_location_csv, memory_map=True)
        panda_df['BuildingTIV'] = panda_df['BuildingTIV'] * self.exposure_pre_analysis_setting['BuildingTIV_multiplyer']
        panda_df.to_csv(self.oed_location_csv, index=False)

Exposure modification data

File: tests/data/exposure_pre_analysis_settings.json

{"BuildingTIV_multiplyer": 2}

Output
There are now two sets of OED files under the oasis files input directory: the set prefixed epa_*.csv is the modified exposure, while the originals are stored as *.csv.

$ tree input/
input/
├── account.csv
├── location.csv
├── reinsinfo.csv
├── reinsscope.csv
    ...
├── epa_account.csv
├── epa_location.csv
├── epa_reinsinfo.csv
└── epa_reinsscope.csv

Published by awsbuild about 5 years ago

OasisLMF - 1.6.0

New features

  • #480 - Extend calcrules to cover more combinations of financial terms
  • #506 - Improve performance in write_exposure_summary()
  • #523 - Long description field to model_settings.json schema
  • #524 - Total TIV sums in exposure report
  • #527 - Group OED fields from model settings

Issues fixed

  • #481 - Corrections to fm_profile for type 2 terms
  • #503 - Change areaperil id datatype to uint64
  • #512 - Issue in generate rtree index CLI
  • #513 - Breaking change in msgpack 1.0.0
  • #514 - fix ; issues in LocPerilsCovered
  • #515 - Store the loc_id of failed location rows
  • #516 - Refactored the upload_settings method in API client
  • #522 - Update MDK run directory names

Published by awsbuild about 5 years ago

OasisLMF - 1.5.1

Hotfix

  • #502 - Change criteria to identify duplicate rows in fm_programme

Published by awsbuild about 5 years ago

OasisLMF - 1.5.0

New features

Step Policy features supported

  • OED v1.1.0 input format
  • Unlimited steps
  • % TIV trigger and % TIV payout and limit
  • % TIV trigger and % loss payout
  • % TIV trigger and % limit payout
  • % TIV trigger and fixed monetary payout
  • Monetary trigger and % loss payout (franchise deductible)
  • Step trigger types: 1 - Buildings, 2 - Contents, 3 - Buildings and Contents
  • Debris removal gross up factor
  • Extra expenses gross up factor and limit

Example test cases: fm54 fm55 fm57 fm58

Command line option for setting group_id

[From #453] You can now give multiple fields after the --group-id-col or -G argument on the command line to assign group_id. For example:

$ oasislmf model generate-oasis-files -C oasislmf.json --group-id-col GeogName1 AccNumber
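A sketch of the effect (hypothetical data; lower-cased column names stand in for GeogName1 and AccNumber, and this is not the library's internal code): items sharing the same values in the selected columns receive the same group_id, and sampled losses within a group are correlated.

```python
import pandas as pd

items = pd.DataFrame({
    'item_id': [1, 2, 3, 4],
    'geogname1': ['London', 'London', 'Paris', 'Paris'],
    'accnumber': ['A1', 'A2', 'A1', 'A1'],
})

# One group_id per unique (geogname1, accnumber) combination.
items['group_id'] = items.groupby(['geogname1', 'accnumber'], sort=True).ngroup() + 1
print(items['group_id'].tolist())  # → [1, 2, 3, 3]
```

Here items 3 and 4 share both column values, so they fall into the same sampling group.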

CLI option to set a complex model gulcalc command

[From #474] A custom binary name can be added to the oasislmf.json:

{
   ...
   "model_custom_gulcalc":  "my_gul_piwind.py",
}

This will raise a run error if the binary is not found on the path, rather than reverting to the reference ktools gulcalc binary

Run error: Custom Gulcalc command "my_gul_piwind.py" explicitly set but not found in path.
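The fail-fast behaviour can be sketched with shutil.which (a minimal illustration, not the package's actual code):

```python
import shutil

def check_custom_gulcalc(cmd):
    """Raise immediately if a configured custom gulcalc command is not on
    the PATH, rather than silently falling back to the ktools binary."""
    if shutil.which(cmd) is None:
        raise RuntimeError(
            'Custom Gulcalc command "{}" explicitly set '
            'but not found in path.'.format(cmd))
```

Failing at startup like this surfaces a misconfigured model_custom_gulcalc value before any losses are generated.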

Update to the Model Settings schema

[From #478 #484 #485] Added the following; for more details, see the pull request notes.

Under model_settings

  • numeric_parameters, for unbound floats and integers
  • dropdown_parameters, same format as 'events' or 'occurrence' but under a generic 'name'
  • parameter_groups, for grouping related parameters of different types

New section data_settings

Added for datafile_selectors, which are used to connect uploaded files under /v1/data_files/ to an analysis

{
   "model_settings":{ 
        ...
   },
   "lookup_settings": {
       ...
   },
   "data_settings":{
      "datafile_selectors":[
          { ... }
      ]
   }
}

Issues fixed

  • #491 - Fix in oasislmf exposure run command
  • #477 - setup.py fails when behind a proxy
  • #482 - symlink error when rerunning analysis using existing analysis_folder
  • #460 - CLI, remove use of lowercase '-c'
  • #493 - generate_model_losses fix for spaces in file paths
  • #475 - Prevent copy of model_data directory on OSError
  • #486 - Run error using pandas==1.0.0
  • #459 - File generation issue in fm_programme
  • #456 - Remove calls to get_ids in favour of pandas groupby
  • #492 - ComplexModel error guard run in sub-shell
  • #451 - ComplexModel error guard in bash script
  • #415 - RI String matching issue
  • #462 - Issue in fm_programme file generation
  • #463 - Incorrect limits in FM file generation
  • #468 - Fifo issue in Bash script generation

Published by awsbuild about 5 years ago

OasisLMF - 1.4.7rc1

Changelog

  • #415 - Fix RI matching issue
  • #462 - Fix Issue in fm_programmes file generation
  • #464 - Update API client with new Queued Job states
  • #453 - Allow user to select group_id based on columns in loc file
  • #463 - Fix incorrect limits in FM file generation
  • #468 - Fix fifo issue in Bash script generation
  • #454 - Update model_settings.json schema
  • #474 - Added option to set gulcalc command - raises an error if not in path

Published by awsbuild over 5 years ago

OasisLMF - 1.4.6

Changes

  • Fixed #451 - Complex models can now run with the error_guard option
  • Fixed #452 - Check available columns before creating summary groups
  • Update to model_settings.json schema structure
  • Added Ktools logging for components start/stop, under <run_dir>/log

Published by sambles over 5 years ago

OasisLMF - 1.4.5

Changes

  • Fix for fm_programme mapping
  • Fix for IL files generation
  • Fix issue #439 - il summary groups
  • Reduce memory use in GUL inputs generation (#440)
  • Update to API client - handle rotating refresh token
  • Update JSON schema files (#438)
  • Update API client - add settings JSON endpoints (#444)
  • Add fully correlated option to MDK (#446)
  • Add data type conversion and check for valid OED peril codes (#448)

Breaking changes

  • Removed the --summarise-exposure flag under oasislmf model [run|generate-oasis-files]. This setting is now on by default to more closely match docker based runs. To revert to the previous behaviour add --disable-summarise-exposure

Published by awsbuild over 5 years ago

OasisLMF - 1.4.4

Changes

  • Hotfix - Added the run flag --ktools-disable-guard option for complex models & custom binaries #434

Published by awsbuild over 5 years ago

OasisLMF - 1.4.3

Changes

  • Added support for compressed file extensions #427
  • Fix docker kill error #425
  • Fix in IL inputs #422
  • Fix for summary info data types #426
  • Update IL allocations rules #431
  • Various fixes for CLI #426
  • Various fixes for ktools scripts #432
  • Fixed for relative paths in manager class #412
  • Fix for Leccalc options in analysis_settings.json #407
  • Fix for eltcalc only model runs #428

Components

ktools 3.1.4

Published by awsbuild over 5 years ago

OasisLMF - 1.4.2

Changes

  • Added parallel keys lookup using multiprocessing
  • Updated the API client - present list of available models and prompt user for selection
  • Set a default configuration file when running the MDK; oasislmf model run will search for ./oasislmf.json
  • Simplified MDK flags and added backward compatibility for older flags #391
  • Added oasislmf config update command to auto-update existing configuration files
  • Set the default ktools core count from 2 to all available cores
  • Added oasislmf test model-validation to verify ktools model data files
  • Added error checking for Non-matching calculation rules during oasis files generation #401
  • Added OED json data type files for accounts, ri-scope and ri-info
  • Store the message field for successful lookup for complex models #398

Components

ktools 3.1.3

Published by awsbuild over 5 years ago

OasisLMF - 1.4.1

  • Added bash autocomplete #386
  • Fix for exposure data types on lookup #387
  • Fix for non-OED fields in summary levels #377
  • Fix in Reinsurance Layer Logic #381
  • Refactor deterministic loss generation #371
  • Added bdist package for OSX #372
  • Added Allocation rule for Ground up loss #376

Published by sambles over 5 years ago

OasisLMF - 1.4.0

Changes

From OasisLMF release 1.4.0 onwards, there are two changes relevant to model lookups:

  • all custom lookups, including complex lookups, now need to set a loc_id column in the locations dataframe (in process_locations) at the very start of processing the locations. loc_id is an index on portnumber, accnumber and locnumber (it enumerates unique combinations of values in these columns). locnumber should no longer be used on its own, as any dataframe with multiple portfolios and/or accounts containing duplicate location numbers would be processed incorrectly. Note: there is a utility method called get_ids (which you can import from oasislmf.utils.data) for creating composite column indices in any Pandas dataframe. To switch to using loc_id, import this method in your lookup module and add the following lines to the start of the process_locations method (if your lookup subclasses oasislmf.model_preparation.lookup.OasisBaseKeysLookup) or of your custom method for generating the keys items:
loc_df.columns = loc_df.columns.str.lower()
if 'loc_id' not in loc_df:
    loc_df['loc_id'] = get_ids(loc_df, ['portnumber', 'accnumber', 'locnumber'])

where loc_df is the exposure/locations dataframe.

  • a new gulcalc stream type has been introduced which will replace the existing gulcalc -c and gulcalc -i streams. We did this to enable reporting by sub-peril for all loss perspectives, and it has also simplified the ktools design. This change is backwards compatible, but downstream code will need to be updated to take advantage of the new outputs.

Published by awsbuild over 5 years ago