ThinkHazard!
Provides a general view of the hazards, for a given location, that should be considered in project design and implementation to promote disaster and climate resilience.
https://github.com/GFDRR/thinkhazard
Category: Climate Change
Sub Category: Natural Hazard and Storm
Keywords from Contributors
geomapfish geotools jasperreports ogc reporting openlayers corse mapstore geojson
Last synced: about 18 hours ago
JSON representation
Repository metadata
ThinkHazard!
- Host: GitHub
- URL: https://github.com/GFDRR/thinkhazard
- Owner: GFDRR
- License: gpl-3.0
- Created: 2015-01-27T13:46:42.000Z (about 10 years ago)
- Default Branch: master
- Last Pushed: 2025-04-08T14:33:22.000Z (20 days ago)
- Last Synced: 2025-04-25T14:07:24.934Z (3 days ago)
- Language: Python
- Homepage: http://thinkhazard.org/
- Size: 22.5 MB
- Stars: 38
- Watchers: 16
- Forks: 19
- Open Issues: 36
- Releases: 1
Metadata Files:
- Readme: README.md
- Changelog: CHANGES.rst
- License: LICENSE
- Support: support_tools/ADM_processing/Patches/ADM0_patch.cpg
- Authors: AUTHORS.txt
README.md
ThinkHazard
A natural hazard screening tool for disaster risk management project planning. ThinkHazard! is maintained by the Global Facility for Disaster Reduction and Recovery (GFDRR). It provides a classified hazard level (very low to high) for any location in the world, along with advice on managing disaster risk, useful reports and contacts, for 11 natural hazards.
API instructions can be found here: https://github.com/GFDRR/thinkhazard/blob/master/API.md
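For a quick look at what the public API returns, here is a minimal Python sketch; the endpoint pattern and the division code used below are assumptions made for illustration, so refer to API.md for the authoritative URL format and response fields.
import requests

# Illustrative only: fetch the hazard summary for one administrative division.
# The URL pattern and the division code are placeholders; see API.md.
DIVISION_CODE = 31  # hypothetical division code
url = f"https://www.thinkhazard.org/en/report/{DIVISION_CODE}.json"

response = requests.get(url, timeout=30)
response.raise_for_status()

for entry in response.json():
    # Field names may differ; adjust to the structure documented in API.md.
    hazard = entry.get("hazardtype", {}).get("hazardtype", "unknown hazard")
    level = entry.get("hazardlevel", {}).get("title", "unknown level")
    print(f"{hazard}: {level}")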
Getting Started
The following commands assume that the system is Debian/Ubuntu. Commands may need to be adapted when working on a different system.
Build docker images:
make build
Run the composition and initialize the database:
docker compose up -d
make initdb
Now point your browser to http://localhost:8080.
Run checks and automated tests:
make check test
Initialize a fresh database
Install the PostgreSQL unaccent extension in the database engine:
sudo apt install postgresql-contrib
Edit /etc/postgresql/9.5/main/postgresql.conf and set max_prepared_transactions to 10.
Create a database:
sudo -u postgres createdb -O www-data thinkhazard_admin
sudo -u postgres psql -d thinkhazard_admin -c 'CREATE EXTENSION postgis;'
sudo -u postgres psql -d thinkhazard_admin -c 'CREATE EXTENSION unaccent;'
sudo -u postgres createdb -O www-data thinkhazard
sudo -u postgres psql -d thinkhazard -c 'CREATE EXTENSION postgis;'
sudo -u postgres psql -d thinkhazard -c 'CREATE EXTENSION unaccent;'
If you want to use a different user or different database name, you’ll have to provide your own configuration file. See “Use local.ini” section below.
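If you want to double-check that both databases have the required extensions, a small sketch like the following can list them using psycopg2 (already among the project dependencies); the connection parameters are assumptions matching the commands above and may need to be adapted.
import psycopg2

# Sanity check only: list the extensions installed in each database.
# Connection parameters are assumptions; adapt host/user/credentials as needed.
for dbname in ("thinkhazard_admin", "thinkhazard"):
    with psycopg2.connect(dbname=dbname, user="www-data", host="localhost") as conn:
        with conn.cursor() as cur:
            cur.execute("SELECT extname FROM pg_extension ORDER BY extname;")
            print(dbname, [row[0] for row in cur.fetchall()])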
Create the required schema and tables and populate the enumeration tables:
make populatedb
Note: this may take a while. If you don't want to import all of the world's administrative divisions, you can import only a subset:
make populatedb DATA=turkey
or:
make populatedb DATA=indonesia
In order to harvest a GeoNode instance with full access, you need to create and configure an API key.
On the GeoNode side:
- Create a superuser with the following command:
python manage.py createsuperuser
- Then create API keys for all users with:
python manage.py backfill_api_keys
- Finally, you can display all API keys with:
SELECT people_profile.id, username, key
FROM people_profile
LEFT JOIN tastypie_apikey ON (tastypie_apikey.user_id = people_profile.id)
On the ThinkHazard side:
- Change the username and api_key values according to the previous setup in the thinkhazard_processing.yaml file.
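As an illustration of how these credentials are typically used against a GeoNode/tastypie API, the sketch below authenticates a request with the username and api_key pair; the GeoNode host and endpoint are placeholders, and the actual harvest task may build its requests differently.
import requests

# Illustrative sketch: authenticate against a GeoNode (tastypie) API with the
# username/api_key pair configured in thinkhazard_processing.yaml.
# GEONODE_URL and the endpoint are placeholders for this example.
GEONODE_URL = "https://geonode.example.org"
USERNAME = "admin"                 # must match thinkhazard_processing.yaml
API_KEY = "0123456789abcdef"       # key returned by the SQL query above

response = requests.get(
    f"{GEONODE_URL}/api/layers/",
    params={"username": USERNAME, "api_key": API_KEY},  # tastypie-style key auth
    timeout=30,
)
response.raise_for_status()
# Assumes tastypie's default list format with an "objects" key.
print(len(response.json().get("objects", [])))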
You’re now ready to harvest, download and process the data:
make harvest
make download
make complete
make process
make decisiontree
For more options, see:
make help
Processing tasks
Administrators can also launch the individual processing tasks with more options.
docker compose run --rm thinkhazard harvest [--force] [--dry-run]
Harvest metadata from GeoNode, create HazardSet and Layer records.
docker compose run --rm thinkhazard download [--title] [--force] [--dry-run]
Download raster files into the data folder.
docker compose run --rm thinkhazard complete [--force] [--dry-run]
Identify hazardsets whose layers have been fully downloaded, infer several fields and mark these hazardsets complete.
docker compose run --rm thinkhazard process [--hazardset_id ...] [--force] [--dry-run]
Calculate output from hazardsets and administrative divisions.
docker compose run --rm thinkhazard decision_tree [--force] [--dry-run]
Apply the decision tree followed by upscaling on process outputs to get the final relations between administrative divisions and hazard categories.
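Purely to illustrate the upscaling idea (a parent division inheriting the most severe hazard level found among its children), here is a minimal, hypothetical sketch; the level ordering and data layout are assumptions, not the project's actual decision tree code.
# Hypothetical illustration of upscaling: the parent division takes the most
# severe hazard level found among its child divisions.
LEVEL_ORDER = {"VLO": 0, "LOW": 1, "MED": 2, "HIG": 3}

child_levels = {"child-1": "LOW", "child-2": "HIG", "child-3": "MED"}

parent_level = max(child_levels.values(), key=lambda level: LEVEL_ORDER[level])
print(parent_level)  # -> HIG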
Publication of admin database on public site
Publication consists of overwriting the public database with the admin one. This can be done using:
make publish
This will execute the following steps:
- Lock the public site in maintenance mode.
- Store a publication date in the admin database.
- Back up the admin database into the archives folder.
- Create a fresh public database.
- Restore the admin backup into public database.
- Unlock the public site from maintenance mode.
Configure admin username/password
Authentication is based on the environment variable HTPASSWORDS, which should contain usernames and passwords in the Apache htpasswd file format.
To create an authentication file .htpasswd with admin as the initial user:
htpasswd -c .htpasswd admin
It will prompt for the password.
Add or modify username2 in the password file .htpasswd:
htpasswd .htpasswd username2
Then pass the content of the file to the environment variable:
environment:
HTPASSWORDS: |
admin:admin
user:user
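For reference, htpasswd-formatted content like this can also be created and verified programmatically with passlib (already in the dependency list); whether the application itself uses passlib this way is an assumption of the sketch below.
from passlib.apache import HtpasswdFile

# Build an in-memory htpasswd entry (equivalent to `htpasswd -c .htpasswd admin`).
ht = HtpasswdFile()
ht.set_password("admin", "secret")

print(ht.check_password("admin", "secret"))  # True
print(ht.check_password("admin", "wrong"))   # False
print(ht.to_string())                        # htpasswd-formatted content for HTPASSWORDS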
Analytics
If you want to get some analytics on website usage (via Google Analytics), you can add the tracking code using the ANALYTICS variable:
environment:
ANALYTICS: UA-75301865-1
Feedback
The feedback_form_url can be configured in the production.ini file.
Configuration of processing parameters
The thresholds, return periods and units for the different hazard types can be configured via the thinkhazard_processing.yaml file.
After any modification to this file, the next harvesting run will delete all layers, hazardsets and processing outputs. This means that the next processing task will have to treat all hazardsets and may take a while (close to one hour).
hazard_types
Harvesting and processing configuration for each hazard type. One entry for each hazard type mnemonic.
Possible subkeys include the following:
- hazard_type: Corresponding hazard_type value in GeoNode.
- return_periods: One entry per hazard level mnemonic with the corresponding return periods. Each return period can be a single value or a list with minimum and maximum values. Example:
  return_periods:
    HIG: [10, 25]
    MED: 50
    LOW: [100, 1000]
- thresholds: Flexible threshold configuration.
  This can be a simple, global value per hazard type. Example:
  thresholds: 1700
  But it can also contain one or more sublevels for complex configurations:
  - global and local entries for the corresponding hazardsets.
  - One entry per hazard level mnemonic.
  - One entry per hazard unit from GeoNode.
  Example:
  thresholds:
    global:
      HIG:
        unit1: value1
        unit2: value2
      MED:
        unit1: value1
        unit2: value2
      LOW:
        unit1: value1
        unit2: value2
      MASK:
        unit1: value1
        unit2: value2
    local:
      unit1: value1
      unit2: value2
- values: One entry per hazard level, with a list of the corresponding values in the preprocessed layer. If present, the layer is considered preprocessed, and the above thresholds and return_periods are not taken into account. Example:
  values:
    HIG: [103]
    MED: [102]
    LOW: [101]
    VLO: [100, 0]
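As a hypothetical illustration of how such a values mapping classifies a preprocessed layer, the sketch below applies it to a handful of toy pixel values with numpy; the real pipeline reads the rasters (e.g. with rasterio) and is more involved.
import numpy as np

# Hypothetical illustration: map preprocessed pixel values to hazard levels
# using a `values` mapping like the one above. Toy data only.
values = {"HIG": [103], "MED": [102], "LOW": [101], "VLO": [100, 0]}
pixels = np.array([0, 100, 101, 102, 103, 103])

levels = np.full(pixels.shape, None, dtype=object)
for level, level_values in values.items():
    levels[np.isin(pixels, level_values)] = level

print(levels)  # one hazard level label per pixel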
Translations
ThinkHazard! is translated using Transifex.
Workflow
We use lingua to extract translation strings from jinja2 templates.
Use the following command to update the gettext template (.pot):
make extract_messages
Note: this should be done from the production instance ONLY in order to have
the up-to-date database strings extracted!
You will have to make sure that ~/.transifexrc is valid and that the credentials have the correct rights.
Then you can push the translation sources to Transifex.
make transifex-push
Once the translations are OK on Transifex it's possible to pull the translations:
make transifex-pull
Don't forget to compile the catalog (i.e. convert .po to .mo):
make compile_catalog
Development
There are 3 different ways to translate strings in the templates:
- translate filter
  This should be used for strings corresponding to enumeration tables in the database.
  {{ hazard.title | translate }}
- gettext method
  To be used for any UI string.
  {{ gettext('Download PDF') }}
- model class method
  Some model classes have a specific method to retrieve the value from a field specific to the chosen language.
  {{ division.translated_name(request.locale_name) }}
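Purely as an illustration of the third approach, such a model method might look like the sketch below; the per-locale column layout is an assumption for the example and not necessarily how the actual models are defined.
# Hypothetical sketch of a per-locale accessor on a model class; the real
# ThinkHazard models may store translations differently.
class AdministrativeDivision:
    def __init__(self, name, name_fr=None, name_es=None):
        self.name = name          # default (English) name
        self.name_fr = name_fr
        self.name_es = name_es

    def translated_name(self, locale_name):
        # Fall back to the default name when no translation exists.
        return getattr(self, f"name_{locale_name}", None) or self.name

division = AdministrativeDivision("Switzerland", name_fr="Suisse")
print(division.translated_name("fr"))  # Suisse
print(division.translated_name("es"))  # Switzerland (fallback)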
Owner metadata
- Name: Global Facility for Disaster Reduction and Recovery (GFDRR)
- Login: GFDRR
- Email:
- Kind: organization
- Description: GFDRR supports developing countries on disaster risk reduction and climate change adaptation
- Website: https://www.gfdrr.org/en
- Location:
- Twitter: GFDRR
- Company:
- Icon url: https://avatars.githubusercontent.com/u/708300?v=4
- Repositories: 83
- Last synced at: 2024-04-15T14:43:00.778Z
- Profile URL: https://github.com/GFDRR
GitHub Events
Total
- Watch event: 3
- Delete event: 1
- Push event: 3
- Pull request event: 1
- Create event: 2
Last Year
- Watch event: 3
- Delete event: 1
- Push event: 3
- Pull request event: 1
- Create event: 2
Committers metadata
Last synced: 8 days ago
Total Commits: 1,185
Total Committers: 16
Avg Commits per committer: 74.063
Development Distribution Score (DDS): 0.64
Commits in past year: 3
Committers in past year: 1
Avg Commits per committer in past year: 3.0
Development Distribution Score (DDS) in past year: 0.0
Name | Email | Commits |
---|---|---|
Pierre GIRAUD | p****d@c****m | 427 |
[email protected] | a****n@c****m | 267 |
Antoine Abt | a****e@a****m | 208 |
Éric Lemoine | e****e@g****m | 121 |
Tobias Kohr | t****r@c****m | 41 |
François Van Der Biest | f****t@c****m | 40 |
Stuart Fraser | s****1@g****m | 37 |
marionb | m****r@c****m | 19 |
Julien Acroute | j****e@c****m | 8 |
Javier Daza | j****s@g****m | 5 |
tsauerwein | t****n@c****m | 4 |
Mattia Amadio | 4****C | 3 |
Mattia Amadio | 4****o | 2 |
Ariel Núñez | i****l@g****m | 1 |
Arnaud Morvan | a****n@g****m | 1 |
Marion Baumgartner | m****r@i****l | 1 |
Committer domains:
Issue and Pull Request metadata
Last synced: 2 days ago
Total issues: 36
Total pull requests: 68
Average time to close issues: over 1 year
Average time to close pull requests: about 1 month
Total issue authors: 8
Total pull request authors: 5
Average comments per issue: 3.08
Average comments per pull request: 0.54
Merged pull request: 60
Bot issues: 0
Bot pull requests: 5
Past year issues: 0
Past year pull requests: 3
Past year average time to close issues: N/A
Past year average time to close pull requests: about 3 hours
Past year issue authors: 0
Past year pull request authors: 2
Past year average comments per issue: 0
Past year average comments per pull request: 0.0
Past year merged pull request: 1
Past year bot issues: 0
Past year bot pull requests: 2
Top Issue Authors
- stufraser1 (16)
- matamadio (12)
- vdeparday (3)
- pgiraud (1)
- martyclark (1)
- arnaud-morvan (1)
- pzwsk (1)
- EceOzen (1)
Top Pull Request Authors
- arnaud-morvan (42)
- tkohr (15)
- dependabot[bot] (5)
- tonio (5)
- stufraser1 (1)
Top Issue Labels
- 2ndpriority (6)
- v3 potential (5)
- 1stpriority (4)
- enhancement (4)
- 3rdpriority (3)
- Support (1)
- bug (1)
- data issue (1)
Top Pull Request Labels
- dependencies (5)
- javascript (2)
Dependencies
- bootlint ~0.11.0
- bootstrap 3.3.7
- casperjs ~1.1.4
- d3 ~3.5.16
- d3-geo-projection ~0.2.16
- font-awesome ~4.5.0
- jquery ~3.3.1
- jshint ~2.5.1
- leaflet ~0.7.7
- less ~1.7.5
- openlayers ~4.0.1
- topojson ~1.6.24
- typeahead.js ~0.11.1
- WebTest * development
- coverage * development
- flake8 * development
- mock * development
- nose * development
- pep8-naming * development
- pydevd * development
- APScheduler *
- Babel *
- PyPDF2 *
- SQLAlchemy *
- alembic *
- asyncio *
- boto3 *
- celery *
- colorlog *
- geoalchemy2 *
- gunicorn *
- httplib2 *
- jinja2 *
- lingua *
- markdown *
- papyrus *
- passlib *
- paste *
- psycopg2 *
- pyppeteer *
- pyproj *
- pyquery *
- pyramid <2
- pyramid_debugtoolbar *
- pyramid_jinja2 *
- pyramid_tm *
- python-slugify <2.0.0
- pytidylib *
- pyyaml *
- rasterio *
- redis *
- requests *
- requests_futures *
- secure <1,>=0.3
- shapely *
- simplejson *
- transaction *
- transifex-client *
- waitress *
- zope.sqlalchemy *
- actions/checkout v3 composite
- base latest build
- python 3.8-slim-buster build
- camptocamp/postgres 12 build
Score: 7.076653815443951