OpenQuake Engine 2.4.0
[Michele Simionato (@micheles)]
- Now the command `oq export loss_curves/rlz-XXX` works both for the `classical_risk` calculator and the `event_based_risk` calculator
[Daniele Viganò (@daniviga)]
- Remove the default 30 day-old view limit in the WebUI calculation list
[Michele Simionato (@micheles)]
- Fixed a broken import affecting the command `oq upgrade_nrml`
- Made it possible to specify multiple file names in the source_model_logic_tree file
- Reduced the data transfer in the object `RlzsAssoc` and improved the postprocessing of hazard curves when the option `--hc` is given
- Changed the `ruptures.xml` exporter to export unique ruptures
- Fixed a bug when downloading the outputs from the WebUI on Windows
- Made `oq info --report` fast again by removing the rupture fine filtering
- Improved the readability of the CSV export `dmg_total`
- Removed the column `eid` from the CSV export `ruptures`; also renamed the field `serial` to `rup_id` and reordered the fields
- Changed the event loss table exporter: now it exports an additional column with the `rup_id`
- Changed the scenario npz export to export also the GMFs outside the maximum distance
- Fixed the scenario npz export when there is a single event
- Replaced the event tags with numeric event IDs
- The mean hazard curves are now generated by default
- Improved the help message of the command `oq purge`
- Added a `@reader` decorator to mark tasks reading directly from the file system
- Removed the .txt exporter for the GMFs, used internally in the tests
- Fixed a bug with relative costs which affected master for a long time, but not the release 2.3. The insured losses were wrong in that case.
- Added an .hdf5 exporter for the asset loss table
- Loss maps and aggregate losses are computed in parallel or sequentially depending on whether the calculation is a postprocessing calculation or not
- Deprecated the XML risk exporters
- Removed the .ext5 file
- Restored the parameter `asset_loss_table` in the event based calculators
- Added a full .hdf5 exporter for `hcurves-rlzs`
- Removed the `individual_curves` flag: now by default only the statistical hazard outputs are exported
- Saved a lot of memory in the computation of the hazard curves and stats
- Renamed the parameter `all_losses` to `asset_loss_table`
- Added an experimental version of the event based risk calculator which is able to use GMFs imported from an external file
- Added a `max_curve` functionality to compute the upper limit of the hazard curves amongst realizations
- Raised an error if the user specifies `quantile_loss_curves` or `conditional_loss_poes` in a classical_damage calculation
- Added a CSV exporter for the benefit-cost-ratio calculator
- The classical_risk calculator now reads directly the probability maps, not the hazard curves
- Turned the loss curves into on-demand outputs for the event based risk calculator
- The loss ratios are now stored in the datastore and not in an external .ext5 file
- The engine outputs are now streamed by the WebUI
- Used a temporary export directory in the tests, to avoid conflicts in multiuser situations
- Added an .npz exporter for the loss maps
- Raised an error early when using a complex logic tree in scenario calculations
- Changed the CSV exporter for the loss curves: now it exports all the curves for a given site for the classical_risk calculator
- Fixed the save_ruptures procedure when there are more than 256 surfaces in the MultiSurface
- Renamed the `csq_` outputs of the scenario_damage calculator to `losses_`
- Changed the way scenario_damage outputs are stored internally to be more consistent with the other calculators
- Removed the GSIM from the exported file name of the risk outputs
- New CSV exporter for GMFs generated by the event based calculator
- The event IDs are now unique and a constraint on the maximum number of source groups (65,536) has been added
- Added an output `losses_by_event` to the scenario_risk calculator
- Changed the output `ruptures.csv` to avoid duplications
- Added an output `losses_by_taxon` to the scenario_risk calculator
- Fixed a performance bug in `get_gmfs`: now the scenario risk and damage calculators are orders of magnitude faster for big arrays
- Added an export test for the event loss table in the case of multiple TRTs
- Removed the experimental `rup_data` output
- Added an .npz export for the output `losses_by_asset`
- Exported the scenario_risk aggregate losses in a nicer format
[Daniele Viganò (@daniviga)]
- The 'oq webui' command now works on a multi-user installation
- Split the RPM packages into python-oq-engine (single node) and python-oq-engine-master/python-oq-engine-worker (multi-node)
[Paolo Tormene (@ptormene)]
- The 'Continue' button in the Web UI is now available also for risk
calculations
[Michele Simionato (@micheles)]
- Fixed a Python 3 bug in the WebUI when continuing a calculation: the hazard_calculation_id was passed as a string and not as an integer
- Changed the rupture storage to use variable-length arrays, with a speedup of two orders of magnitude
- Avoided storing twice the rupture events
- Optimized the serialization of ruptures on HDF5 by using a `sids` output
- Changed the Web UI button from "Run Risk" to "Continue"
- The `avg` field in the loss curves is computed as the integral of the curve again, and it is not extracted from the avg_losses output anymore
- Made the `fullreport` exportable
- Fixed the `rup_data` export, since the boundary field was broken
- Restored the output `losses_by_taxon` in the event_based_risk calculator
- Fixed the event based UCERF calculator so that average losses can be stored
[Daniele Viganò (@daniviga)]
- Added a check to verify that an 'oq' client is talking to the right DbServer instance
- Introduced an optional argument for the 'oq dbserver' command line to be able to override its default interface binding behaviour
[Michele Simionato (@micheles)]
- Optimized the event based calculators by reducing the number of calls to the GmfComputer and by using larger arrays
- Added a check on missing vulnerability functions for some loss type for some taxonomy
- Now we save the GMFs on the .ext5 file, not the datastore
- Fixed a bug in event_based_risk: it was impossible to use vulnerability functions with "PM" distribution
- Fixed a bug in event_based_risk: the ebrisk calculator is required as precalculator of event_based_risk, not others
- Fixed a bug in scenario_risk: the output `all_losses-rlzs` was aggregated incorrectly
- Now the ucerf_risk calculators transfer only the events, not the ruptures, thus reducing the data transfer by several orders of magnitude
- Added a view `get_available_gsims` to the WebUI and fixed the API docs
- Introduced a configuration parameter `max_site_model_distance` with a default of 5 km
- Implemented sampling in the UCERF event based hazard calculator
[Daniele Viganò (@daniviga)]
- Use threads instead of processes in DbServer because SQLite3
isn't fork-safe on macOS Sierra
[Michele Simionato (@micheles)]
- Fixed a TypeError when deleting a calculation from the WebUI
- Extended the command `oq to_hdf5` to manage source model files too
- Improved significantly the performance of the event based calculator when computing the GMFs and not the hazard curves
- Stored information about the mean ground motion in the datastore
- Saved the rupture mesh with 32 bit floats instead of 64 bit floats
- Raised the limit on the event IDs from 2^16 to 2^32 per task
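The storage savings from these two changes are easy to quantify with the standard library; a minimal sketch (the mesh size is made up for illustration, and the item sizes assume CPython's usual 4/8 byte `array` floats):

```python
from array import array

# Hypothetical rupture mesh: 1000 points with 3 coordinates each
# (the size is illustrative, not taken from the engine).
coords = [0.123456789] * 3000

mesh64 = array('d', coords)  # 64 bit floats
mesh32 = array('f', coords)  # 32 bit floats

print(mesh64.itemsize * len(mesh64))  # 24000 bytes
print(mesh32.itemsize * len(mesh32))  # 12000 bytes, half the storage

# Event-ID capacity per task before and after the change
print(2 ** 16)  # 65536
print(2 ** 32)  # 4294967296
```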
- Fixed classical_risk: there was an error when computing the statistics in the case of multiple assets of the same taxonomy on the same site
- Changed the UCERF event based calculators to parallelize by SES
- Fixed a site model bug: when the sites are extracted from the site model there is no need to perform geospatial queries to get the parameters
- Added a command `oq normalize` to produce good `sites.csv` files
- Introduced a `ses_seed` parameter to specify the seed used to generate the stochastic event sets; `random_seed` is used for the sampling only
- Changed the `build_rcurves` procedure to read the loss ratios directly from the workers
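The division of labour between the two seeds can be pictured with a hypothetical job.ini fragment (the section name and the values are made up; only the parameter names come from the entry above):

```ini
[general]
# seed used to generate the stochastic event sets
ses_seed = 42
# seed used only for the logic tree sampling
random_seed = 1066
```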
OpenQuake Engine 2.3.0
[Michele Simionato (@micheles)]
- `oq info --report` now filters the ruptures and reports the correct number of effective ruptures even for classical calculators
- Stripped the TRT information from the event loss table CSV export and optimized its performance
- Fixed a bug when storing the GMPE logic tree file in the datastore
- Added a command `oq run_tiles` (experimental)
- Fixed the event based calculator so that it can run UCERF ruptures
- Fixed a bug in the scenario_risk calculator in case of multiple assets of the same taxonomy on the same site with no insurance losses
- Now the event IDs are generated in the workers in the event based calculator and there is a limit of 65,536 tasks with 65,536 ruptures each
- Changed the UCERF classical calculators to compute one branch at a time
- Fixed the header `occupants:float32` in the CSV risk exports involving occupants
- Fixed the name of the zipped files downloaded by the Web UI: there was a spurious dot
- Fixed the UCERF classical calculator in the case of sampling
- Reduced the size of the event tags in the event based calculators, thus saving GB of disk space in UCERF calculations
- Fixed the name of the files downloaded by the Web UI: they must not contain slashes
- Now deleting a calculation from the Web UI really deletes it; before, it was only hiding it
[Daniele Viganò (@daniviga)]
- Moved the OpenQuake Engine manual sources inside doc/manual
[Michele Simionato (@micheles)]
- Introduced an experimental classical time dependent UCERF calculator
- Added a dynamic output for source group information
- Changed the UCERF rupture calculator to fully store the ruptures
- Fixed a bug in `combine_maps`: realizations with zero probability were discarded, thus breaking the computation of the statistics
- Added a command `oq reset` to reset the database and the datastores
- Reduced the data transfer back and the disk space occupation for UCERF event based risk calculations
- Tasks meant to be used with a shared directory are now marked with a boolean attribute `.shared_dir_on`
- Added a warning when running event based risk calculations with sampling
- Made sure that the openquake.cfg file is read only once
[Daniele Viganò (@daniviga)]
- Moved the openquake.cfg config file inside the python package under openquake/engine/openquake.cfg
- Removed support for the OQ_LOCAL_CFG_PATH and OQ_SITE_CFG_PATH variables; only the OQ_CONFIG_FILE environment variable is read
[Michele Simionato (@micheles)]
- If there is a single realization, do not compute the statistics
- Changed the separator from comma to tab for the output `ruptures`
- If there are no conditional_loss_poes, the engine does not try to export the loss maps anymore
- Fixed `oq engine --make-html-report` when using Python 3
- Fixed a bug when running `oq info job.ini` with NRML 0.5 source models
OpenQuake Engine 2.2.0
[Michele Simionato (@micheles)]
- Fixed an HDF5 bug by not using a `vstr` array for the asset references
- Fixed a wrong error message generated by `oq purge`
- Added information about the rupture in the event loss table exports
- Fixed a bug and added a test calculation with nonparametric sources
- Fixed the classical UCERF calculator when there is more than one branch
- Added .npz exporter for gmf_data for event based calculations
[Daniele Viganò (@daniviga)]
- Port WebUI/API server to Django 1.9 and 1.10
- Add dependencies to setup.py
- Update Copyright to 2017
[Michele Simionato (@micheles)]
- Increased the splitting of ComplexFaultSources
- Added a way to reuse the CompositeSourceModel from a previous computation
- Turned the loss maps into dynamically generated outputs
- Extended the source model writer to serialize the attributes src_interdep, rup_interdep, srcs_weights
- Fixed a bug when exporting the uniform hazard spectra in presence of IMTs other than spectral acceleration
- Fixed a bug when computing the loss maps in presence of insurance, temporarily introduced in master
- Made the datastore for event based risk calculations much lighter by computing the statistical outputs at export time
- Now it is possible to post process event based risk outputs with the `--hc` option
- Added a command `oq to_hdf5` to convert .npz files into .hdf5 files
- Moved commonlib.parallel into baselib
- Merged the experimental calculator ebrisk into event_based_risk and used correctly the random_seed for generating the GMFs (not the master_seed)
- Added a flag `ignore_covs` to ignore the coefficients of variation
- Changed the GMF scenario exporter to avoid generating composite arrays with a large number of fields
- Exporting in .npz format rather than HDF5
- Introduced a `shared_dir` parameter in openquake.cfg
- Fixed a serialization bug for planar surfaces
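As a rough illustration, the new parameter would live in openquake.cfg along these lines (the section name and the path are assumptions, not taken from the source):

```ini
[directory]
# directory visible from all the nodes of a cluster;
# left empty when no shared directory is available
shared_dir = /mnt/oq-shared
```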
- Removed the flag `asset_loss_table`: the loss ratios are saved if and only if the `loss_ratios` dictionary is non-empty
- Added a CSV exporter for the GMFs in the event based calculator
- Added a CSV exporter for the rup_data output
- Added a CSV exporter for the disaggregation output
- Stored the disaggregation matrices directly (no pickle)
- Turned the CompositeRiskModel into a HDF5-serializable object
- Fixed all doctests for Python 3
[Daniele Viganò (@daniviga)]
- Removed the 'oq-engine' wrapper (command already deprecated)
[Michele Simionato (@micheles)]
- Assigned a year label to each seismic event in the event based calculator
- Now the ebrisk calculator supports the case of asset_correlation=1 too
- Made it possible to export the losses generated by a specific event
- Lowered the limit on the length of source IDs to 60 chars
- Fixed excessive strictness when validating `consequenceFunction.id`
- Added a `ucerf_rupture` calculator able to store seismic events and rupture data and reduced the data transfer
[Daniele Viganò (@daniviga)]
- MANIFEST now includes all files, with any extension located in the
tests folders. It is now possible to run tests from an installation
made with packages
[Michele Simionato (@micheles)]
- Improved the error message when the user gives a source model file instead of a source model logic tree file
- Fixed the management of negative calculation IDs
- Relaxed the tolerance so that the tests pass on Mac OS X
- Implemented a CSV exporter for the ruptures
- Optimized the epsilon generation in the ebrisk calculator for asset_correlation=0
- Improved the performance of the scenario risk calculators
- Now by default we do not save the ruptures anymore
- Fixed a memory leak recently introduced in parallel.py
- Simplified classical_risk (the numbers can be slightly different now)
- Serialized the ruptures in the HDF5 properly (no pickle)
- Introduced a parameter `iml_disagg` in the disaggregation calculator
- Fixed `oq reduce` to preserve the NRML version
- Fixed a bug when splitting the fault sources by magnitude
OpenQuake Engine 2.1.1
[Michele Simionato (@micheles)]
- Fixed a bug when splitting the fault sources by magnitude
OpenQuake Engine 2.1.0
[Michele Simionato (@micheles)]
- There is now a flag `save_ruptures` that can be turned off on demand; by default the ruptures are always saved in the event based calculators
- Optimized the memory consumption when using a ProcessPoolExecutor (i.e. fork before reading the source model) by means of a `wakeup` task
- Reduced the splitting of the fault sources
- Added a view `task_slowest` displaying info about the slowest task (only for classical calculations for the moment)
- concurrent_tasks=0 now disables the concurrency
- Optimized the saving time of the GMFs
- Changed the default number of concurrent tasks and increased the relative weight of point sources and area sources
- Fixed the UCERF event loss table export and added a test for it
- Optimized the computation of the event loss table
- Introduced two new calculators, ucerf_risk and ucerf_risk_fast
[Paolo Tormene (@ptormene)]
- Added to the engine server the possibility to log in and out
programmatically by means of HTTP POST requests
[Michele Simionato (@micheles)]
- Optimized the memory consumption of the event based risk calculators
- Extended the `oq show` command to work in a multi-user environment
- Improved the test coverage of the exports in the WebUI
- Removed the SourceManager: now the sources are filtered in the workers and we do not split in tiles anymore
- Made the full datastore downloadable from the WebUI
- Added a command "oq db" to send commands to the engine database (for internal usage)
- By default the WebUI now displays only the last 100 calculations
- Added more validity checks to the disaggregation parameters; split the sources even in the disaggregation phase
- Added an optimized event based calculator computing the total losses by taxonomy and nothing else
- Filtered the sources up front when there are few sites (<= 10)
- Reduced the number of tasks generated when filter_sources is False
- Saved engine_version and hazardlib_version as attributes of the datastore
- Avoided saving the ruptures when ground_motion_fields is True
- Finalized the HDF5 export for hazard curves, hazard maps and uniform hazard spectra
- Restored a weight of 1 for each rupture in the event based calculator
- Removed the MultiHazardCurveXMLWriter
- Improved the saving of the ruptures in event based calculations
- Reduced the data transfer due to the `rlzs_by_gsim` parameter
- Added an HDF5 export for scenario GMFs
- If `filter_sources` is false, the light sources are not filtered, but the heavy sources are always filtered
- Now the dbserver can be stopped correctly with CTRL-C
- Parallelized the splitting of heavy sources
- Changed the event loss table exporter: now a single file per realization is exported, containing all the loss types
- Removed the dependency from the Django ORM
- Now the WebUI restarts the ProcessPoolExecutor at the end of each job, to conserve resources
- Optimized the computation of hazard curves and statistics, especially for the memory consumption
- Reduced the data transfer due to the `rlzs_assoc` and `oqparam` objects
- Fixed a bug in the disaggregation calculator when a source group has been filtered away by the maximum distance criterion
- Fixed an encoding error in the reports when the description contains a non-ASCII character
- Changed the distribution framework: celery is supported in a way more consistent with the other approaches; moreover, ipyparallel is supported
- Hazard maps are now a fake output, dynamically generated at export time
- Made the number of produced tasks proportional to the number of tiles
- Raised an error for event_based_risk producing no GMFs
- Added a view for the slow sources
- Transmitted the attributes of a SourceGroup to the underlying sources
- Fixed the names of exported files for hazard maps in .geojson format
- Added a header with metadata to the exported hazard curves and maps
- Avoid storing filtered-away probability maps, thus fixing a bug
- Restored the precalculation consistency check that was disabled during the transition to engine 2.0
- Fixed a bug with `oq engine --delete-calculation`
- Hazard curves/maps/uniform spectra can now be recomputed
- Restored the early check on missing taxonomies
- Raised an early error if a user forgets the `rupture_mesh_spacing` parameter
- Fixed a bug while deleting jobs from the db in Ubuntu 12.04
- Ported the shapefile converter from the nrml_converters
- Added source model information in the file `realizations.csv`
- `oq engine --run job.ini --exports csv` now also exports the realizations
- Introduced the format NRML 0.5 for source models
- Added a check on the version in case of export errors
- Extended `oq purge` to remove calculations from the database too
- Fixed `--make-html-report`: the view task_info was not registered
- Stored several strings as HDF5-variable-length strings
- Fixed an export bug for the hazard curves in .geojson format
- Removed the array cost_types from the datastore
- Taxonomies with chars not in the range a-z0-9 were incorrectly rejected
- Improved the XML parsing utilities in speed, memory, portability and ease of use
- Forbade the reuse of the exposure because it was fragile and error prone
- Fixed a bug with the `realizations` array, which in hazard calculations was empty in the datastore
OpenQuake Engine 2.0.1
[Michele Simionato (@micheles)]
- Fixed a bug for tectonic region types filtered away
OpenQuake Engine 2.0
[Michele Simionato (@micheles)]
- Quoted the taxonomies in the CSV exports
- Fixed a bug in classical_damage and added a master test for it
- Fixed the escaping of the taxonomies in the datastore
- Fixed the names of the exported risk files
- Fixed a segfault in the WebUI when exporting files with h5py >= 2.4
- Added a command `oq dbserver` to start/stop the database server
- The engine exports the hazard curves one file per IMT
- Exported lon and lat with 5 digits after the decimal point
- Added a command `oq info --build-reports`
- Introduced experimental support for exporting .hdf5 files
[Daniele Viganò (@daniviga)]
- Reworked substantially the engine documentation: removed obsolete pages, updated to engine 2.0 and added instructions for Windows and Mac OS X
- Remove oq_create_db script, db is created by the DbServer
- Move oq_reset_db into utils and clean old code
[Michele Simionato (@micheles)]
- Now the DbServer automatically upgrades the database if needed
- Renamed oq-lite -> oq and added a subcommand `oq engine`
- Added a CSV reader for the hazard curves
- Having time_event=None in the hazard part of a calculation is now valid
- Added an exporter for the rupture data, including the occurrence rate
- Refactored the CSV exporters
- Moved celeryconfig.py; now celery must be started with `celery worker --config openquake.engine.celeryconfig`
- Added a default location `~/oqdata/dbserver.log` for the DbServer log
- Added an early check on the SA periods supported by the GSIMs
- Now the gsim_logic_tree file is parsed only once
- Added a document about the architecture of the engine
- The realizations are now exported as a CSV file
- Escaped taxonomies in the datastore
- The Web UI log tool is now escaping the HTML
- Moved openquake.commonlib.commands -> openquake.commands and openquake.commonlib.valid -> openquake.risklib.valid to have a linear tower of internal dependencies
- Supported all versions of Django >= 1.5
- Provided a better error message in the absence of openquake.cfg
- Removed the check on the export_dir when using the WebUI
- Reduced the data transfer of the realization association object
- If uniform_hazard_spectra is true, the UHS curves are generated even if hazard_maps is false; the hazard maps are not exported
- Optimized the filtering of PointSources
- Initial work on the UCERF event based hazard calculator
- Added a test calculation crossing the International Date Line (Alaska)
[Daniele Viganò (@daniviga)]
- Remove the dependency from the python 'pwd' package which is not available on Windows
- Supervisord init scripts are now provided for the dbserver, celery and the webui. Celery is not started by default; the other two are.
[Michele Simionato (@micheles)]
- Another export fix: made sure it is run by the current user
- Fixed the export: if the export directory does not exist, it is created
- Introduced the configuration variable `multi_user`, false for source installations and true for package installations
- Fixed the WebUI export
- Removed the .txt outputs from the WebUI page engine/<output_id>/outputs (they are useful only internally)
- Fixed the export: first the xml exporter is tried and then the csv exporter; if both are available, only the first is used, not both of them
- Optimized the case when the epsilons are not required, i.e. all the covariance coefficients are zero in the vulnerability functions
- Added another test for event based risk (`case_miriam`)
- Revisited the distribution mechanism and refined the weight of the ruptures in the event based calculators to avoid generating slow tasks
- Added an automatic help for the subcommands of oq-lite and managed --version correctly
- The event based risk calculator now uses different seeds for different realizations; also, the performance has been substantially improved
- Improved the .rst reports with data transfer information
- Removed the RlzsAssoc object from the datastore
- Fixed the number of tasks generated by the risk calculators
- Refactored the serialization of CompositionInfo instances to HDF5
- Used exponential notation with 5 decimal digits in most exported XML files
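Exponential notation with 5 decimal digits corresponds to `%.5E`-style formatting in Python; a minimal illustration with made-up values:

```python
# Format sample values the way the exported XML numbers are described:
# exponential notation with 5 decimal digits.
values = [0.000123456789, 1234.5678, 0.5]
formatted = ['%.5E' % v for v in values]
print(formatted)  # ['1.23457E-04', '1.23457E+03', '5.00000E-01']
```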
- Refactored the sampling mechanics in the event based calculators
- The event_based_risk calculator infers the minimum intensity of the GMFs from the vulnerability functions (if not specified in the job.ini)
- Fixed the `avg_losses-stats`: they were not generated in absence of loss curves
- Added a command `oq-lite info --exports`
- Added filtering on the minimum intensity also in the event based hazard calculator; improved the performance and memory occupation
- Added a view displaying the calculation times by source typology
- Fixed the test of GMPETable after the correction in hazardlib
- Optimized the saving of the asset loss table
- Optimized the case of multiple assets of the same taxonomy on the same point and introduced a datastore view `assets_by_site`
- Fixed HDF5 segmentation faults in the tests for Ubuntu 16.04
[Daniele Viganò (@daniviga)]
- Add support for Ubuntu 16.04 (xenial) packages
- Removed the openquake_worker.cfg file because it is not used anymore
[Michele Simionato (@micheles)]
- Replaced PostgreSQL with SQLite
- Introduced a dbserver to mediate the interaction with the database
- Restored the signal handler to manage properly `kill` signals so that the workers are revoked when a process is killed manually
- Fixed in a more robust way the duplicated log bug
- Made more robust the killing of processes by patching concurrent.futures
- Fixed a critical bug with celery not being used even when `use_celery` was true
- Improved the validation of NRML files
- Added a command `oq-engine --show-log <job_id>`
[Daniele Viganò (@daniviga)]
- Use the 'postgresql' meta package as dependency of the .deb package to support newer versions of Postgres; this makes the Trusty package installable on Ubuntu 16.04 and Debian 8
[Daniele Viganò (@daniviga), Michele Simionato (@micheles)]
- Fixed a bug in `oq-engine --export-outputs`
[Daniele Viganò (@daniviga), Matteo Nastasi (@nastasi-oq)]
- Allow installation of the binary package on Ubuntu derivatives
[Matteo Nastasi (@nastasi-oq)]
- Backport of libhdf5 and h5py for the Ubuntu 'precise' series
[Michele Simionato (@micheles)]
- Removed openquake/engine/settings.py
- Made the dependency on celery required only in cluster installations
- Integrated the authentication database in the engine server database
- Fixed the description in the Web UI (before it was temporarily set to the string "A job").
- Introduced filtering on the minimum intensity of the ground shaking
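The entry above implies a job.ini knob for the intensity threshold; a hypothetical fragment (the parameter name `minimum_intensity` and the value are assumptions for illustration):

```ini
# ground motion values below this threshold are discarded
minimum_intensity = 0.01
```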
- Solved the issue of serializing large SES collections, over the HDF5 limit
- The loss maps and curves XML exporters now export the coordinates of the assets, not the coordinates of the closest hazard site
- Stored the job.ini parameters into a table in the datastore
- Added a check on the IMTs coming from the risk models
- Changed the aggregate loss table exporter to export the event tags, not the event IDs
- Fixed a bug with the CSV export of the ground motion fields
- Fixed a bug with the export of UHS curves with `--exports=xml`
- Reduced substantially the data transfer and the memory occupation for event based calculations with a large number of assets: we can run the California exposure with half a million assets now
- Fixed a bug in the SESCollection exporter
- Changed the asset<->epsilons association: before, for a given taxonomy, the assets were ordered by `asset_ref`; now they are ordered by `id`. This has a minor impact on the numbers sensitive to the epsilons, akin to a change of seeds
- Added a test on the ordering of the epsilons
- Accepted `.` and `|` as valid characters for source IDs
- Changed the GMF calculator to use a single seed per unique rupture
- Changed the SESCollection exporter: now a single file is exported, before we were exporting one file per source model path per tectonic region model
- Changed the event based calculators to avoid duplicating ruptures occurring more than once
- Changed the risk calculators to work in blocks of assets on the same site
- Made it possible to set different integration distances for different tectonic region types
- Optimized the aggregation by asset in the event based risk calculator
- Reported the source_id when the filtering fails
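Using a single seed per unique rupture means every occurrence of a duplicated rupture replays the same random stream; a hedged Python sketch (the seed derivation below is invented for illustration, not the engine's actual scheme):

```python
import random

def rupture_seed(master_seed, rupture_serial):
    # Hypothetical derivation: combine the master seed with the unique
    # rupture serial, so duplicated occurrences of one rupture share a
    # seed while distinct ruptures get distinct seeds.
    return master_seed * 100003 + rupture_serial

def sample_gmf(master_seed, rupture_serial, n=3):
    rng = random.Random(rupture_seed(master_seed, rupture_serial))
    return [rng.random() for _ in range(n)]

# Two occurrences of the same rupture produce identical values:
assert sample_gmf(42, 7) == sample_gmf(42, 7)
# A different rupture produces a different stream:
assert sample_gmf(42, 8) != sample_gmf(42, 7)
```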
OpenQuake Engine 1.9.1
[Michele Simionato (@micheles)]
- Fixed a bug in the Web UI when running a risk calculation starting
from a previous calculation
OpenQuake Engine 1.9
[Michele Simionato (@micheles)]
- Fixed a bug such that in some circumstances the logging stream handler was instantiated twice, resulting in duplicated logs
- Changed the default job status to 'executing' (was 'pre_executing')
- Fixed the ordering of the logs in the Web UI
- Removed the dependency from PostGIS
- Restored the monitoring which was accidentally removed
- Removed the obsolete option `--hazard-output-id`
- Printed the names of the files exported by the engine, even when there are multiple files for a single output
- Introduced four new tables job, output, log, performance: all the other 60+ database tables are not used anymore
OpenQuake Engine 1.8
[Michele Simionato (@micheles)]
- Removed two `oq-engine` switches (`--export-stats` and `--list-inputs`) and fixed `--show-view`; unified `--delete-hazard-calculation` and `--delete-risk-calculation` into a single `--delete-calculation`
- Updated `make_html_report.py` to extract the full report from the datastore
- If `use_celery` is true, use celery to determine a good default for the parameter `concurrent_tasks`
- Made celery required only in cluster situations
- Fixed the duplication of exported results in the classical_damage calculator when there is more than one realization
- Removed several obsolete or deprecated switches from the `oq-engine` command
- Replaced all classical calculators with their lite counterparts
- Fixed the site-ordering in the UHS exporter (by lon-lat)
[Paolo Tormene (@ptormene)]
- Added API to validate NRML
[Michele Simionato (@micheles)]
- The engine can now zip files larger than 2 GB (used in the export)
- Now the loss maps and curves are exported with a fixed ordering: first by lon-lat, then by asset ID
- Replaced the old disaggregation calculator with the oq-lite one