Trigger calculation #73
Option 1: Use the e.g. 5 year WMS product and do a colour lookup. If it is a 5 year return period then the probability needs to exceed 75% to trigger.

- @mazano: set up a new version of our docker image that includes pg_cron (https://github.com/citusdata/pg_cron) and PL/Python.
- @lucernae: implement the GloFAS fetch routine (e.g. run daily) and populate any new floods.
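The colour-lookup trigger check could be sketched roughly as below. Only the 5-year / 75% pair comes from this thread; the other return-period thresholds are placeholders that would need to be confirmed against the recommendation.

```python
# Map return period (years) -> probability threshold for the trigger.
# Only the 5-year / 75% pair is from the discussion above; the 10- and
# 20-year values are placeholders.
PROBABILITY_THRESHOLDS = {
    5: 0.75,
    10: 0.60,  # placeholder
    20: 0.50,  # placeholder
}

def exceeds_trigger(return_period_years, probability):
    """Return True when the forecast probability for the given
    return-period product exceeds its trigger threshold."""
    threshold = PROBABILITY_THRESHOLDS.get(return_period_years)
    if threshold is None:
        raise ValueError(
            f"No threshold configured for {return_period_years}-year product")
    return probability > threshold
```

The thresholds live in a plain dict so they can be tweaked without touching the comparison logic.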
I will explain here what I imagine would happen. The source is Hassan's current recommendation from the field visit report (I will attach it in the repo later).

### The criteria

There are two trigger stages: Pre-activation and Activation.

### Pre-activation

Quoted from the recommendation:
### Activation

Whichever municipality area is in the Pre-activation stage should be checked again against the Activation trigger.
### Preparing the flood depth map based on return period

We need a flood depth map (continuous depth, or classified by depth) that is associated with a return period. This flood map must be ready in the database.
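As a minimal sketch of the association between flood maps and return periods, the snippet below uses an in-memory SQLite table. The table and column names are hypothetical; the real application would presumably store a PostGIS raster/geometry rather than a file path.

```python
import sqlite3

# Hypothetical schema: names are illustrative only.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE flood_map (
        id INTEGER PRIMARY KEY,
        return_period_years INTEGER NOT NULL,
        depth_class TEXT,          -- e.g. 'continuous' or 'classified'
        source_path TEXT NOT NULL  -- stand-in for the raster itself
    )
""")
conn.execute(
    "INSERT INTO flood_map (return_period_years, depth_class, source_path) "
    "VALUES (?, ?, ?)",
    (20, "classified", "flood_20yr.tif"))

# Look up the flood map matching a forecast's return period.
row = conn.execute(
    "SELECT source_path FROM flood_map WHERE return_period_years = ?",
    (20,)).fetchone()
```

A forecast that trips a trigger would then join on `return_period_years` to find its pre-computed map.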
I can't, @timlinux, I don't have the flood model. My suggestion would be to treat the 100 year flood as 10 or 20 year. The trigger conditions that Hassan proposed are for more than 10 years, so the 100 year map can be used for now. If we get a 10/20 year return period map, we can switch accordingly.

### Fetch GloFAS information in the background, daily

There are two (alternative) ways of getting a GloFAS forecast. I will explain each in turn.

#### Getting forecast from reporting points

This approach is limited to areas near the reporting points.
In addition to that, there is a bar chart like this: it contains information on the EPS for a certain date. We use this to get lead time information. So, by querying the reporting points daily, we are looking at:
By doing this, we only save to the database those forecasts that trip the pre-activation/activation triggers, then associate each with the corresponding flood map (already in the database) for that return period and the overlapping station points. The flood forecast browser in the frontend should only show forecasts that are in the database, i.e. forecasts that trigger pre-activation or activation.

#### Getting forecast from GloFAS WMS map

We can get a specific prediction for each category, but it doesn't show us the probability.

**For pre-activation:** GloFAS has a WMS map for "Flood summary for days 11-30".

**For activation:** GloFAS has a WMS map for "Flood summary for days 4-10".

#### Pros/cons of each approach

**Getting forecast from reporting points**

Pros:
Cons:
**Getting forecast from WMS map**

Pros:
Cons:
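The two GloFAS summary windows described above map naturally onto the two trigger stages. A minimal sketch of that classification, assuming lead time in whole days (the probability handling is not shown, since the WMS summary maps don't expose it):

```python
def classify_stage(lead_time_days):
    """Map a forecast lead time onto the two GloFAS summary windows:
    days 11-30 -> pre-activation, days 4-10 -> activation."""
    if 11 <= lead_time_days <= 30:
        return "pre-activation"
    if 4 <= lead_time_days <= 10:
        return "activation"
    return None  # outside both summary windows
```

A forecast at 15 days lead time would land in pre-activation, and the same event re-checked at 7 days would land in activation.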
Current accepted approach: Forecast fetch
Proposed approach for impact limit evaluations: Impact limit evaluation
Proposed trigger status evaluation: Trigger status escalations
Thanks @lucernae, the impact limit evaluation looks great. I expect we will need to tweak this over time, but hopefully that is just a matter of changing some variables in your code. For the trigger status escalations, my understanding from Catalina is that the trigger status is based on time. So if
Is that your same view, or were you thinking of different logic?
Yes, the same kind of logic. For pre-activation, the lead_time criterion is a minimum of 10 days. There are also no criteria for STOP for the moment.
I made the criteria parameters configurable (the limit value, lead time value, etc.). The logic itself (what to compare, when to compare, what to update) is still Python code. I hope we can refactor it into backend modules (Python modules) for easier modification later on. At the moment it is one giant class in a single script.
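A rough sketch of the time-based escalation described in this exchange. Only the 10-day minimum lead time for pre-activation comes from the thread; the rest (the escalation rule when lead time shrinks, and leaving the status unchanged in the absence of STOP criteria) is an assumption about how the script behaves.

```python
PRE_ACTIVATION_MIN_LEAD = 10  # days, from the comment above; configurable

def escalate(current_status, lead_time_days, still_triggering):
    """Sketch of time-based trigger escalation: a municipality in
    pre-activation is re-checked as the lead time shrinks."""
    if not still_triggering:
        # No STOP criteria are defined yet, so the status is kept.
        return current_status
    if lead_time_days >= PRE_ACTIVATION_MIN_LEAD:
        return "pre-activation"
    if current_status == "pre-activation":
        return "activation"
    return current_status
```

Keeping the numeric thresholds as module-level parameters matches the intent above of tweaking values without rewriting the comparison logic.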
Thanks @lucernae
Pre-computed flood boundary for e.g. 100 year
@lucernae: generate a sample dataset for 20 years, since the trigger conditions Hassan proposed are for 10 or 20 years. @lucernae: make some sample datasets.
http://www.globalfloods.eu/proxy/?srv=ows&SERVICE=WMS&VERSION=1.3.0&REQUEST=GetFeatureInfo&FORMAT=image%2Fpng&TRANSPARENT=true&QUERY_LAYERS=sumAL43EGE%2CreportingPoints&TIME=2019-12-01T00%3A00%3A00&LAYERS=sumAL43EGE%2CreportingPoints&INFO_FORMAT=application%2Fjson&I=797&J=75&WIDTH=832&HEIGHT=832&CRS=EPSG%3A3857&STYLES=&BBOX=9966573.160085278%2C-1304525.282733674%2C11323279.454128297%2C52181.01130934769
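The long GetFeatureInfo URL above can be built programmatically, which makes the daily fetch routine easier to parameterise (e.g. swapping the `TIME` value each day). The parameter values below are taken directly from that URL; only the dict-building itself is new.

```python
from urllib.parse import urlencode

# Parameters copied from the GetFeatureInfo URL above.
base = "http://www.globalfloods.eu/proxy/"
params = {
    "srv": "ows",
    "SERVICE": "WMS",
    "VERSION": "1.3.0",
    "REQUEST": "GetFeatureInfo",
    "FORMAT": "image/png",
    "TRANSPARENT": "true",
    "QUERY_LAYERS": "sumAL43EGE,reportingPoints",
    "TIME": "2019-12-01T00:00:00",   # would vary per daily fetch
    "LAYERS": "sumAL43EGE,reportingPoints",
    "INFO_FORMAT": "application/json",
    "I": 797,
    "J": 75,
    "WIDTH": 832,
    "HEIGHT": 832,
    "CRS": "EPSG:3857",
    "STYLES": "",
    "BBOX": "9966573.160085278,-1304525.282733674,"
            "11323279.454128297,52181.01130934769",
}
url = base + "?" + urlencode(params)  # urlencode percent-escapes the values
```

The fetch routine would then request `url` (e.g. with `urllib.request` or `requests`) and parse the returned JSON for the reporting-point probabilities.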