Releases: gem/oq-engine
OpenQuake Engine 3.8.0
[Graeme Weatherill (@g-weatherill)]
- Updates SERA Craton GMPE to incorporate NGA East site response and reflect
changes in CEUS USGS model
[Michele Simionato (@micheles)]
- The total loss curves in event_based_risk are now built with pandas
- Added an option `oq engine --param` to override the job.ini parameters
- Internal: reduced the number of NGAEastUSGS classes from 39 to 1
- Internal: reduced the number of NGAEast classes from 44 to 2
- Internal: reduced the 15 NSHMP2014 classes to a single class
- Internal: reduced the 22 NBCC2015_AA13 classes to a single class
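The `--param` override mentioned above can be invoked as sketched below (the `--run` usage and the comma-separated name=value syntax are assumptions; the parameter values are purely illustrative):

```shell
# Hypothetical example: override two job.ini parameters at run time
oq engine --run job.ini --param "truncation_level=3,investigation_time=50"
```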
[Graeme Weatherill (@g-weatherill)]
- Adds complete suite of GMPEs for the Central and Eastern US, as adopted
  within the 2018 US National Seismic Hazard Map
- Implements NGA East site amplification model within NGA East Base class
- Implemented site amplification by convolution
- Improved the error message if the `event_id` does not start from zero in
  the gmfs.csv files
- Changed the rupture exporter to export LINESTRINGs instead of degenerate
  POLYGONs
- Introduced `minimum_loss_fraction` functionality in ebrisk
- Refined the rupture prefiltering mechanism, possibly changing the numbers
  in calculations with nonzero coefficients of variation
- Optimized the generation of aggregate loss curves in ebrisk
- Introduced an experimental AvgGMPE and used it to implement (optional)
  reduction of the gsim logic tree
[Graeme Weatherill (@g-weatherill)]
- Implemented Abrahamson et al (2018) update of the BC Hydro GMPE
- Added configurable nonergodic sigma option to BC Hydro and SERA GMPEs
- Small refactoring and bug fix in average SA GMPE
[Michele Simionato (@micheles)]
- Avoided reading multiple times the GSIM logic tree
- Changed the GSIM logic tree sampling by ordering the branches by TRT
- Ignored IMT-dependent weights when using sampling to make such calculations
  possible
- Storing (partially) the asset loss table
[Robin Gee (@rcgee)]
- Set DEFINED_FOR_REFERENCE_VELOCITY in CampbellBozorgnia2003NSHMP2007
[Graeme Weatherill (@g-weatherill)]
- Re-adjustment of SERA Subduction model epistemic scaling factors
[Michele Simionato (@micheles)]
- Improved the task distribution in the ebrisk calculator
- Fixed a bug in ebrisk with aggregate_by when building the rup_loss_table
- Storing the asset loss table in scenario_risk, but only for assets and
  events above a `loss_ratio_threshold` parameter
- Storing the asset damage table in scenario_damage and event based damage,
  but only for assets and events above a `collapse_threshold` parameter
- Avoided transferring the GMFs upfront in scenario_damage, scenario_risk
  and event_based_damage
[Daniele Viganò (@daniviga)]
- Included pandas in the engine distribution
[Michele Simionato (@micheles)]
- Avoided reading multiple times the gsim logic tree file and related files
- Added a check for duplicate sites in the site model file
- Implemented an event_based_damage calculator
- Added an API /v1/calc/ID/extract/gmf_data?event_id=XXX
- Added an API /v1/calc/ID/extract/num_events
- Fixed the /v1/calc/ID/status endpoint to return an error 404 when needed
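As a sketch, the extract endpoints listed above can be addressed with a small URL builder (the host and port, the calculation ID 42 and the event_id 3 are assumptions; only the endpoint paths come from the entries above):

```python
# Build URLs for the /v1/calc/<id>/extract/<what> endpoints.
# Host/port and the numeric ids below are illustrative assumptions.
from urllib.parse import urlencode

def extract_url(calc_id, what, base="http://localhost:8800", **params):
    """Return the extract URL for a calculation, with optional query params."""
    url = "%s/v1/calc/%d/extract/%s" % (base, calc_id, what)
    return url + ("?" + urlencode(params) if params else "")

print(extract_url(42, "num_events"))
# http://localhost:8800/v1/calc/42/extract/num_events
print(extract_url(42, "gmf_data", event_id=3))
# http://localhost:8800/v1/calc/42/extract/gmf_data?event_id=3
```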
- Removed the "sites are overdetermined" check, since it is now unneeded
- Turned the calculation of consequences into a plugin architecture
[Matteo Nastasi (@nastasi-oq)]
- Add '/v1/ini_defaults' web api entry point to retrieve all default
values for ini attributes (attrs without a default are not returned)
[Michele Simionato (@micheles)]
- Renamed rlzi -> rlz_id in the sigma-epsilon dataset and exporter
- Renamed id -> asset_id in all the relevant CSV exporters
- Renamed rlzi -> rlz_id in the dmg_by_event.csv output
- Renamed rupid -> rup_id in the ruptures.csv output
- Renamed id -> event_id in the events.csv output and gmfs.csv output
- Renamed sid -> site_id in the gmfs.csv output
- Renamed ordinal -> rlz_id in the realizations.csv output
[Alberto Chiusole (@bebosudo)]
- Changed the way how the available number of CPU cores is computed
[Kendra Johnson (@kejohnso), Robin Gee (@rcgee)]
- Added GMPEs for Rietbrock-Edwards (2019) and Yenier-Atkinson (2015)
[Michele Simionato (@micheles)]
- Added more checks on the IMTs and made it possible to import a GMF.csv
  file with more IMTs than needed
- Enabled magnitude-dependent pointsource_distance
- Removed the syntax for magnitude-dependent maximum distance, since
  now it can be automatically determined by the engine
- Saving more information in the case of single-site classical hazard
- Extended `pointsource_distance` to generic sources
- Removed the boundary information from the CSV rupture exporter
- Changed the /extract/rupture/XXX API to return a TOML that can be
  used by a scenario calculator
- Added general support for file-reading GMPEs
- Made it possible to disaggregate on multiple realizations
  with the parameters `rlz_index` or `num_rlzs_disagg`
- Fixed downloading the ShakeMaps (again)
- Better error message in case of too large maximum_distance
- Optimized the case of point sources with a hypocenter distribution and
  GSIMs independent from it, and in general the case of ruptures with
  similar distances
[Graeme Weatherill (@g-weatherill)]
- Updates SERA craton GMPE to reflect updates to NGA East site response model
[Michele Simionato (@micheles)]
- Fixed an HDF5 SWMR issue in large disaggregation calculations
- Made `rrup` the unique acceptable `filter_distance`
- Fixed disaggregation with a parent calculation
- Models with duplicated values in the hypocenter and/or nodal plane
  distributions are now automatically optimized
- Fixed an issue with missing noDamageLimit causing NaN values in
  scenario_damage calculations
- Added more validations for predefined hazard, like forbidding the site model
[Marco Pagani (@mmpagani)]
- Adding the shift_hypo option for distributed seismicity
[Michele Simionato (@micheles)]
- Raised an early error for extra-large GMF calculations
- Reduced the GMF storage by using 32 bit per event ID instead of 64 bit
- Raised an error in case of duplicated sites in the site model
- Fixed the case of implicit grid with a site model: sites could be
  incorrectly discarded
- Fixed the ShakeMap downloader to find also unzipped uncertainty.xml files
- Fixed the rupture exporters to export the rupture ID and not the
  rupture serial
- Removed the non-interesting `agg_maps` outputs
- Changed the task distribution in the classical calculator and added
  a `task_multiplier` parameter
[Marco Pagani (@mmpagani)]
- Fixed a bug in the GenericGmpeAvgSA
[Michele Simionato (@micheles)]
- Added a `/v1/calc/validate_zip` endpoint to validate input archives
- Deprecated inferring the intensity measure levels from the risk functions
- Fixed a too strict check on the minimum intensities of parent and child
  calculations
- Extended the ebrisk calculator to compute at the same time both the
  aggregate curves by tag and the total curves
[Marco Pagani (@mmpagani)]
- Implemented Morikawa and Fujiwara (2013) GMM
[Michele Simionato (@micheles)]
- Changed the seed algorithm in sampling with more than one source model,
  thus avoiding using more GMPEs than needed in some cases
- If `ground_motion_fields=false` is set, the GMFs are not stored even
  if `hazard_curves_from_gmfs=true`
- `oq show job_info` now works while the calculation is running
- Reduced the data sent in ebrisk calculations
- Deprecated the old syntax for the `reqv` feature
- Added short aliases for hazard statistics: `mean`, `max` and `std`
- Reduced substantially the memory occupation in the task queue
- Added an API `/extract/sources` and an experimental `oq plot sources`
- Added a check on valid input keys in the job.ini
- Fixed the check on dependent calculations
- Specifying at the same time both a grid and individual sites is an error
[Daniele Viganò (@daniviga)]
- Docker containers rebased on CentOS 8
- Fixed an issue causing zombie `ssh` processes when using `zmq`
  as the task distribution mechanism
- Introduced support for RHEL/CentOS 8
[Michele Simionato (@micheles)]
- Added a check for no GMFs in event_based_risk
- Avoided transferring the site collection
- Storing the sources in TOML format
OpenQuake Engine 3.7.1
[Michele Simionato (@micheles)]
- Fixed disaggregation with a parent calculation
- Fixed the case of implicit grid with a site model: sites could be
  incorrectly discarded
- Fixed the ShakeMap downloader to find also unzipped uncertainty.xml files
- Fixed the rupture exporters to export the rupture ID and not the
  rupture serial
[Marco Pagani (@mmpagani)]
- Fixed a bug in the GenericGmpeAvgSA
OpenQuake Engine 3.7.0
[Michele Simionato (@micheles)]
- Hiding calculations that fail before the pre-execute phase (for instance,
  because of missing files); they already give a clear error
- Added an early check on truncation_level in the presence of a correlation
  model
[Guillaume Daniel (@guyomd)]
- Implemented Ameri (2017) GMPE
[Michele Simionato (@micheles)]
- Changed the ruptures CSV exporter to use commas instead of tabs
- Added a check forbidding `aggregate_by` for non-ebrisk calculators
- Introduced a task queue
- Removed the `cache_XXX.hdf5` files by using the SWMR mode of h5py
[Kris Vanneste (@krisvanneste)]
- Updated the coefficients table for the atkinson_2015 to the actual
values in the paper.
[Michele Simionato (@micheles)]
- Added an `/extract/agg_curves` API to extract both absolute and relative
  loss curves from an ebrisk calculation
- Changed `oq reset --yes` to remove oqdata/user only in single-user mode
- Now the engine automatically sorts the user-provided intensity_measure_types
- Optimized the aggregation by tag
- Fixed a bug with the binning when disaggregating around the date line
- Fixed a prefiltering bug with complex fault sources: in some cases, blocks
  of ruptures were incorrectly discarded
- Changed the sampling algorithm for the GMPE logic trees: now it does
  not require building the full tree in memory
- Raised clear errors for geometry files without quotes or with the wrong
  header in the multi_risk calculator
- Changed the realizations.csv exporter to export '[FromShakeMap]' instead
  of '[FromFile]' when needed
- Changed the agg_curves exporter to export all realizations in a single file
  and all statistics in a single file
- Added rlz_id, rup_id and year to the losses_by_event output for ebrisk
- Fixed a bug in the ruptures XML exporter: the multiplicity was multiplied
  (incorrectly) by the number of realizations
- Fixed the pre-header of the CSV outputs to get proper CSV files
- Replaced the 64 bit event IDs in event based and scenario calculations
  with 32 bit integers, for the happiness of Excel users
[Daniele Viganò (@daniviga)]
- Numpy 1.16, Scipy 1.3 and h5py 2.9 are now required
[Michele Simionato (@micheles)]
- Changed the ebrisk calculator to read the CompositeRiskModel directly
from the datastore, which means 20x less data transfer for Canada
[Anirudh Rao (@raoanirudh)]
- Fixed a bug in the gmf CSV importer: the coordinates were being
sorted and new site_ids assigned even though the user input sites
csv file had site_ids defined
[Michele Simionato (@micheles)]
- Fixed a bug in the rupture CSV exporter: the boundaries of a GriddedRupture
  were exported with lons and lats inverted
- Added some metadata to the CSV risk outputs
- Changed the distribution mechanism in ebrisk to reduce the slow tasks
[Graeme Weatherill (@g-weatherill)]
- Updates Kotha et al. (2019) GMPE to July 2019 coefficients
- Adds subclasses to Kotha et al. (2019) to implement polynomial site
  response models and geology+slope site response model
- Adds QA test to exercise all of the SERA site response calculators
[Michele Simionato (@micheles)]
- Internal: there is no need to call `gsim.init()` anymore
[Graeme Weatherill (@g-weatherill)]
- Adds parametric GMPE for cratonic regions in Europe
[Michele Simionato (@micheles)]
- In the agglosses output of scenario_risk the losses were incorrectly
  multiplied by the realization weight
- Removed the output `sourcegroups` and added the output `events`
[Graeme Weatherill (@g-weatherill)]
- Adds new meta ground motion models to undertake PSHA using design code
  based amplification coefficients (Eurocode 8, Pitilakis et al., 2018)
- Adds site amplification model of Sandikkaya & Dinsever (2018)
[Marco Pagani (@mmpagani)]
- Added a new rupture-site metric: the azimuth to the closest point on the
rupture
[Michele Simionato (@micheles)]
- Fixed a regression in disaggregation with nonparametric sources, which
  were effectively discarded
- The site amplification has been disabled by default in the ShakeMap
  calculator, since it is usually already taken into account by the USGS
[Daniele Viganò (@daniviga)]
- Deleted calculations are not removed from the database anymore
- Removed the 'oq dbserver restart' command since it was broken
[Richard Styron (@cossatot)]
- Fixed `YoungsCoppersmith1985MFD.from_total_moment_rate()`: due to numeric
  errors it was producing incorrect seismicity rates
[Michele Simionato (@micheles)]
- Now we generate the output `disagg_by_src` during disaggregation even in the
  case of multiple realizations
- Changed the way the random seed is set for BT and PM distributions
- The filenames generated by the disagg_by_src exporter now contain the site
  ID and not longitude and latitude, consistently with the other exporters
- Accepted again meanLRs greater than 1 in vulnerability functions of kind LN
- Fixed a bug in event based with correlation and a filtered site collection
- Fixed the CSV exporter for the realizations in the case of scenarios
  with parametric GSIMs
- Removed some misleading warnings for calculations with a site model
- Added a check for missing `risk_investigation_time` in ebrisk
- Reduced drastically (I measured improvements over 40x) the memory
  occupation, data transfer and data storage for multi-site disaggregation
- Sites for which the disaggregation PoE cannot be reached are discarded
  and a warning is printed, rather than killing the whole computation
- `oq show performance` can be called in the middle of a computation again
- Filtered out the far away distances and reduced the time spent in
  saving the performance info by orders of magnitude in large disaggregations
- Reduced the data transfer by reading the data directly from the
  datastore in disaggregation calculations
- Reduced the memory consumption by sending disaggregation tasks incrementally
- Added an extract API `disagg_layer`
- Moved `max_sites_disagg` from openquake.cfg into the job.ini
- Fixed a bug with the --config option: serialize_jobs could not be overridden
- Implemented insured losses
OpenQuake Engine 3.6.0
[Michele Simionato (@micheles)]
- In some cases `applyToSources` was giving a fake error about the source
  not being in the source model even if it actually was
[Chris Van Houtte (@cvanhoutte)]
- Adds the Van Houtte et al. (2018) significant duration model for New
Zealand
[Michele Simionato (@micheles)]
- Added a way to compute and plot the MFD coming from an event based
- Storing the MFDs in TOML format inside the datastore
[Robin Gee (@rcgee)]
- Moves b4 constant into COEFFS table for GMPE Sharma et al., 2009
[Graeme Weatherill (@g-weatherill)]
- Adds functionality to Cauzzi et al. (2014) and Derras et al. (2014)
calibrated GMPEs for Germany to use either finite or point source distances
[Michele Simionato (@micheles)]
- Restored the ability to associate site model parameters to a grid of sites
- Made it possible to set `hazard_curves_from_gmfs=true` with
  `ground_motion_fields=false` in the event based hazard calculator
- Introduced a mechanism to split the tasks based on an estimated duration
- Integrated `oq plot_memory` into `oq plot`
- Removed `NaN` values for strike and dip when exporting griddedRuptures
- Fixed `oq reset` to work in multi-user mode
- Extended the source_id-filtering feature in the job.ini to multiple sources
- Supported WKT files for the binary perils in the multi_risk calculator
- Added an early check on the coefficients of variation and loss ratios of
  vulnerability functions with the Beta distribution
- Made sure that `oq engine --dc` removes the HDF5 cache file too
- Removed the flag `optimize_same_id_sources` because it is useless now
- Introduced a soft limit at 65,536 sites for event_based calculations
- Fixed a performance regression in ucerf_classical that was filtering
  before splitting, thus becoming extra-slow
- Improved the progress log, which was delayed for large classical
  calculations
- Exported the ruptures as 3D multi-polygons (instead of 2D ones)
- Changed the `aggregate_by` exports for consistency with the others
- Changed the losses_by_event exporter for ebrisk, to make it more
  consistent with scenario_risk and event_based_risk
- Changed the agglosses and losses_by_event exporters in scenario_risk,
  by adding a column with the realization index
- Changed the generation of the hazard statistics to consume very little
  memory
- Fixed a bug with concurrent_tasks being inherited from the parent
  calculation instead of using the standard default
- Removed the dependency on mock, since it is included in unittest.mock
- For scenario, replaced the `branch_path` with the GSIM representation in
  the realizations output
- Added a check for suspiciously large source geometries
- Deprecated the XML disaggregation exporters in favor of the CSV exporters
- Turned the disaggregation calculator into a classical post-calculator
  to use the precomputed distances and speed up the computation even more
- Fixed the disaggregation calculator by discarding the ruptures outside
  the integration distance
- Optimized the speed of the disaggregation calculator by moving a statistical
  function outside of the inner loop
- Changed the file names of the exported disaggregation outputs
- Fixed an export agg_curves issue with pre-imported exposures
- Fixed an export agg_curves issue when the hazard statistics are different
  from the risk statistics
- Removed the disaggregation statistics: now the engine disaggregates only on
  a single realization (default: the closest to the mean)
- Forbidden disaggregation matrices with more than 1 million elements
- Reduced the data transfer when computing the hazard curves
- Optimized the reading of large CSV exposures
- Fixed the --hc functionality across users
- Optimized the reduction of the site collection on the exposure sites
- Made the gsim logic tree parser more robust: lines like
  <uncertaintyModel gmpe_table="../gm_tables/Woffshore_low_clC.hdf5">
  are accepted again
- Added a check against duplicated values in nodal plane distributions and
  hypocenter depth distributions
- Changed the support for zipped exposures and source models: now the
  name of the archive must be written explicitly in the job.ini
- Added support for numpy 1.16.3, scipy 1.3.0, h5py 2.9.0
- Removed the special case for event_based_risk running two calculations
[Graeme Weatherill (@g-weatherill)]
- Adds the Tromans et al. (2019) adjustable GMPE for application to PSHA
in the UK
[Michele Simionato (@micheles)]
- Optimized src.sample_ruptures for (multi)point sources and area sources
- Fixed a mutability bug in the DistancesContext and made all context
  arrays read-only: the fix may affect calculations using the GMPEs
  berge_thierry_2003, cauzzi_faccioli_2008 and zhao_2006
- Fixed a bug with the minimum_distance feature
- Fixed a bug in the exporter of the aggregate loss curves: now the loss
  ratios are computed correctly even in presence of occupants
- Removed the (long time deprecated) capability to read hazard curves and
  ground motion fields from XML files: you must use CSV files instead
[Marco Pagani (@mmpagani)]
- Implemented a modified GMPE that adds between-event and within-event std
  to GMPEs only supporting total std
[Michele Simionato (@micheles)]
- Added the ability to use a taxonomy_mapping.csv file
- Fixed a bug in classical_damage from CSV: for hazard intensity measure
  levels different from the fragility levels, the engine was giving incorrect
  results
- Serialized also the source model logic tree inside the datastore
- Added a check on missing intensity_measure_types in event based
- Fixed `oq prepare_site_model` in the case of an empty datadir
- Added a comment line with useful metadata to the engine CSV outputs
- Removed the long time deprecated event loss table exporter for event based
  risk and enhanced the losses_by_event exporter to export the realization ID
- Removed the long time deprecated GMF XML exporter for scenario
- IMT-dependent weights in the gsim logic tree can be zero, to discard
  contributions outside the range of validity of (some of the) GSIMs
- Now it is possible to export individual hazard curves from an event based
  calculation
- Added a view gmvs_to_hazard
OpenQuake Engine 3.5.2
OpenQuake Engine 3.5.1
[Michele Simionato (@micheles)]
- Added a `rlzi` column to the sig_eps.csv output
- Accepted GMF CSV files without a `rlzi` column
- Accepted a list-like syntax like `return_periods=[30, 60, 120, 240, 480]`
  in the job.ini, as written in the manual
- Fixed a bug in the asset_risk exporter for uppercase tags
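The list-like syntax can be sketched in a job.ini fragment like the following (the section header is illustrative; only the `return_periods` line comes from the entry above):

```ini
[general]
; list-like syntax for loss-curve return periods, as written in the manual
return_periods = [30, 60, 120, 240, 480]
```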
[Paul Henshaw (@pslh)]
- Fixed an encoding bug while reading XML files on Windows
OpenQuake Engine 3.5.0
[Michele Simionato (@micheles)]
- Added a view gmvs_to_hazard
[Giovanni Lanzano (@giovannilanzanoINGV)]
- Lanzano and Luzi (2019) GMPE for volcanic zones in Italy
[Michele Simionato (@micheles)]
- Now it is possible to export individual hazard curves from an event
  based calculation by setting `hazard_curves_from_gmfs = true` and
  `individual_curves = true` (before only the statistics were saved)
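A minimal job.ini sketch of this setup (the section header and the `calculation_mode` value are assumptions; the two flags are those named in the entry above):

```ini
[general]
calculation_mode = event_based
; compute hazard curves from the GMFs and keep per-realization curves
hazard_curves_from_gmfs = true
individual_curves = true
```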
[Graeme Weatherill (@g-weatherill)]
- Adds adaptation of Abrahamson et al. (2016) 'BC Hydro' GMPEs calibrated
to Mediterranean data and with epistemic adjustment factors
[Chris Van Houtte (@cvanhoutte)]
- Added new class to bradley_2013b.py for hazard maps
- Modified test case_37 to test multiple sites
[Marco Pagani (@mmpagani)]
- Fixed a bug in the logic tree parser and added a check to forbid logic
trees with applyToSources without applyToBranches, unless there is a
single source model branch
[Michele Simionato (@micheles)]
- Removed the experimental parameter `prefilter_sources`
[Daniele Viganò (@daniviga)]
- Multiple DbServer ZMQ connections are restored to avoid errors under heavy
load and/or on slower machines
[Michele Simionato (@micheles)]
- Removed the ugly registration of custom signals at import time: now they
  are registered only if `engine.run_calc` is called
- Removed the dependency on rtree
- Removed all calls to ProcessPool.shutdown to speed up the tests and to
  avoid non-deterministic errors in atexit._run_exitfuncs
[Marco Pagani (@mmpagani)]
- Added tabular GMPEs as provided by Michal Kolaj, Natural Resources Canada
[Michele Simionato (@micheles)]
- Extended the ebrisk calculator to support coefficients of variations
[Graeme Weatherill (@g-weatherill)]
- Adds Kotha et al (2019) shallow crustal GMPE for SERA
- Adds 'ExperimentalWarning' to possible GMPE warnings
- Adds kwargs to check_gsim function
[Michele Simionato (@micheles)]
- Fixed problems like SA(0.7) != SA(0.70) in iml_disagg
- Exposed the outputs of the classical calculation in event based
  calculations with `compare_with_classical=true`
- Made it possible to serialize together all kinds of risk functions,
  including consequence functions that before were not HDF5-serializable
- Fixed a MemoryError when counting the number of bytes stored in large
  HDF5 datasets
- Extended `asset_hazard_distance` to a dictionary for usage with multi_risk
- Extended oq prepare_site_model to work with sites.csv files
- Optimized the validation of the source model logic tree: now checking
  the source IDs is 5x faster
- Went back to the old logic in sampling: the weights are used for the
  sampling and the statistics are computed with identical weights
- Avoided transferring the epsilons by storing them in the cache file
  and changed the event-to-epsilons associations
- Reduced the data transfer in the computation of the hazard curves, causing
  sometimes huge speedups (over 100x)
- Implemented a flag `modal_damage_state` to display only the most likely
  damage state in the output `dmg_by_asset` of scenario damage calculations
- Reduced substantially the memory occupation in classical calculations
  by including the prefiltering phase in the calculation phase
[Daniele Viganò (@daniviga)]
- Added a 'serialize_jobs' setting to the openquake.cfg
which limits the maximum number of jobs that can be run in parallel
[Michele Simionato (@micheles)]
- Fixed two exporters for the ebrisk calculator (agg_curves-stats and
  losses_by_event)
- Fixed two subtle bugs when reading site_model.csv files
- Added /extract/exposure_metadata and /extract/asset_risk
- Introduced an experimental multi_risk calculator for volcanic risk
[Guillaume Daniel (@guyomd)]
- Updating of Berge-Thierry (2003) GSIM and addition of several alternatives
for use with Mw
[Michele Simionato (@micheles)]
- Changed the classical_risk calculator to use the same loss ratios for all
  taxonomies and then optimized all risk calculators
- Temporarily removed the `insured_losses` functionality
- Extended `oq restore` to download from URLs
- Removed the column 'gsims' from the output 'realizations'
- Better parallelized the source splitting in classical calculations
- Added a check for missing hazard in scenario_risk/scenario_damage
- Improved the GsimLogicTree parser to get the line number information, a
  feature that was lost with the passage to Python 3.5
- Added a check against misspellings in the loss type in the risk keys
- Changed the aggregation WebAPI from
  aggregate_by/taxonomy,occupancy/avg_losses?kind=mean&loss_type=structural to
  aggregate/avg_losses?kind=mean&loss_type=structural&tag=taxonomy&tag=occupancy
- Do not export the stddevs in scenario_damage in the case of 1 event
- Fixed an export bug for GMFs imported from a file
- Fixed an encoding error when storing a GMPETable
- Fixed an error while exporting the hazard curves generated by a GMPETable
- Removed the deprecated feature aggregate_by/curves_by_tag
OpenQuake Engine 3.4.0
[Michele Simionato (@micheles)]
- Compatibility with 'decorator' version >= 4.2
[Giovanni Lanzano (@giovannilanzanoINGV)]
- Contributed a GMPE SkarlatoudisEtAlSSlab2013
[Michele Simionato (@micheles)]
- Changed the event loss table exporter to export also rup_id and year
- Extended the ebrisk calculator to compute loss curves and maps
[Rodolfo Puglia (@rodolfopuglia)]
- Spectral acceleration amplitudes at 2.5, 2.75 and 4 seconds added
[Marco Pagani (@mmpagani)]
- Improved the event based calculator to account for cluster-based models
[Michele Simionato (@micheles)]
- Removed the now redundant command `oq extract hazard/rlzs`
[Daniele Viganò (@daniviga)]
- Fixed 'oq abort' and always mark killed jobs as 'aborted'
[Michele Simionato (@micheles)]
- Made it possible to use tasks without a monitor argument in the Starmap
- Stored the sigma and epsilon parameters for each event in event based
  and scenario calculations and extended the gmf_data exporter accordingly
- Fixed the realizations CSV exporter, which was truncating the names of the
  GSIMs
- Deprecated the XML exporters for hcurves, hmaps, uhs
- Introduced a `sap.script` decorator
- Used the WebExtractor in `oq importcalc`
- Restored validation of the source_model_logic_tree.xml file
- Raised an early error for missing occupants in the exposure
- Added a check to forbid duplicate file names in the `uncertaintyModel` tag
- Made it possible to store the asset loss table in the ebrisk calculator
  by specifying `asset_loss_table=true` in the job.ini
- Added a flag `oq info --parameters` to show the job.ini parameters
- Removed the `source_name` column from the disagg by source output
[Rao Anirudh]
- Fixed wrong investigation_time in the calculation of loss maps from
loss curves
[Robin Gee (@rcgee)]
- Added the capability to optionally specify a `time_cutoff` parameter for
  the declustering time window
[Michele Simionato (@micheles)]
- Merged the commands `oq plot_hmaps` and `oq plot_uhs` inside `oq plot`
- Changed the storage of hazard curves and hazard maps to make it consistent
with the risk outputs and Extractor-friendly
[Chris Van Houtte (@cvanhoutte)]
- Added necessary gsims to run the Canterbury Seismic Hazard Model
  in Gerstenberger et al. (2014)
- Added a new gsim file mcverry_2006_chch.py to have the
  Canterbury-specific classes
- Added a new gsim file bradley_2013b.py to implement the
  Christchurch-specific modifications to the Bradley2013 base model
[Michele Simionato (@micheles)]
- Added a check on the intensity measure types and levels in the job.ini,
  to make sure they are ordered by period
- Reduced the number of client sockets to the DbServer that was causing
  (sporadically) the hanging of calculations on Windows
- Extended the WebAPI to be able to extract specific hazard curves, maps
  and UHS (i.e. IMT-specific and site-specific)
- Removed the realization index from the event loss table export, since
  it is redundant
- Forced all lowercase Python files in the engine codebase
- Removed the dependency on nose
[Robin Gee (@rcgee)]
- Updated GMPE of Yu et al. (2013)
[Michele Simionato (@micheles)]
- Added an `Extractor` client class leveraging the WebAPI and enhanced
  `oq plot_hmaps` to display remote hazard maps
- Added a check when disaggregation is attempted on a source model
  with atomic source groups
- Implemented serialization/deserialization of GSIM instances to TOML
- Added a check against misspelled rupture distance names and fixed
  the drouet_alpes_2015 GSIMs
- Changed the XML syntax used to define dictionaries IMT -> GSIM
- Now GSIM classes have an `.init()` method to manage nontrivial
  initializations, i.e. expensive initializations or initializations
  requiring access to the filesystem
- Fixed a bug in event based that made it impossible to use GMPETables
- Associated the events to the realizations even in scenario_risk: this
  involved changing the generation of the epsilons in the case of asset
  correlation. Now there is a single aggregate losses output for all
  realizations
- Removed the rlzi column from the GMF CSV export
- Introduced a new parameter `ebrisk_maxweight` in the job.ini
- For classical calculations with few sites, store information about the
  realization closest to the mean hazard curve for each site
- Removed the max_num_sites limit on the event based calculator
[Valerio Poggi (@klunk386)]
- Added an AvgSA intensity measure type and a GenericGmpeAvgSA which is
able to use it
[Michele Simionato (@micheles)]
- Introduced the ability to launch subtasks from tasks
- Stored rupture information in classical calculations with few sites
[Chris Van Houtte (@cvanhoutte)]
- Adding conversion from geometric mean to larger horizontal component in
bradley_2013.py
[Michele Simionato (@micheles)]
- Fixed a bug in applyToSources for the case of multiple sources
- Moved the prefiltering on the workers to save memory
- Exported the aggregated loss ratios in avg losses and agg losses
- Removed the variables quantile_loss_curves and mean_loss_curves: they
  were duplicating quantile_hazard_curves and mean_hazard_curves
- Only ruptures boundingbox-close to the site collection are stored
[Marco Pagani (@mmpagani)]
- Added cluster model to classical PSHA calculator
[Michele Simionato (@micheles)]
- Fixed a bug in scenario_damage from ShakeMap with noDamageLimit=0
- Avoided the MemoryError in the controller node by speeding up the saving
  of the information about the sources
- Turned utils/reduce_sm into a proper command
- Fixed a wrong coefficient in the ShakeMap amplification
- Fixed a bug in the hazard curves export (the filename did not contain
  the period of the IMT, thus producing duplicated files)
- Parallelized the reading of the exposure
[Marco Pagani (@mmpagani)]
- Fixed the implementation of mutex ruptures
[Michele Simionato (@micheles)]
- Changed the aggregated loss curves exporter
- Added an experimental calculator ebrisk
- Changed the ordering of the events (akin to a change of seed in the
asset correlation)
[Robin Gee (@rcgee)]
- Fixed bug in tusa_langer_2016.py BA08SE model - authors updated b2 coeff
- Fixed bug in tusa_langer_2016.py related to coeffs affecting Repi models
[Michele Simionato (@micheles)]
- Added a check to forbid setting `ses_per_logic_tree_path = 0`
- Added an API `/extract/event_info/eidx`
- Splitting the sources in classical calculators and not in event based
- Removed `max_site_model_distance`
- Extended the logic used in event_based_risk - read the hazard sites
  from the site model, not from the exposure - to all calculators
- In classical_bcr calculations with a CSV exposure the retrofitted field
  was not read. Now a missing retrofitted value is an error
OpenQuake Engine 3.3.2
[Robin Gee (@rcgee)]
- Fixed bug in tusa_langer_2016.py BA08SE model - authors updated b2 coeff
[Michele Simionato (@micheles)]
- Fixed a bug in scenario_damage from ShakeMap with noDamageLimit=0
- Avoided the MemoryError in the controller node by speeding up the saving
  of the information about the sources
- Fixed a wrong coefficient in the ShakeMap amplification
- Fixed a bug in the hazard curves export (the filename did not contain
  the period of the IMT, thus producing duplicated files)