# Releases · Open-EO/openeo-python-client
## openEO Python Client v0.15.0

### Added

- The openeo Python client library can now also be installed with conda (conda-forge channel) (#176)
- Allow using a custom `requests.Session` in `openeo.rest.auth.oidc` logic

### Changed

- Less verbose log printing on failed batch job (#332)
- Improve (UTC) timezone handling in `openeo.util.Rfc3339` and add `rfc3339.today()`/`rfc3339.utcnow()`.
## openEO Python Client v0.14.1

### Fixed

- Fine-tuned `XarrayDataCube` tests for conda building and packaging (#176)
## openEO Python Client v0.14.0

### Added

- Jupyter integration: show process graph visualization of `DataCube` objects instead of generic `repr`. (#336)
- Add `Connection.vectorcube_from_paths()` to load a vector cube from files (on back-end) or URLs with the `load_uploaded_files` process.
- Python 3.10 and 3.11 are now officially supported (tests now also run for 3.10 and 3.11 in GitHub Actions, #346)
- Support for simplified OIDC device code flow (#335)
- Added `MultiBackendJobManager`, based on the implementation from the openeo-classification project (#361)
- Added resilience to `MultiBackendJobManager` for back-end failures (#365)

### Changed

- `execute_batch` also skips temporary `502 Bad Gateway` errors. (#352)

### Fixed

- Fixed/improved math operator/process support for `DataCube`s in "apply" mode (non-"band math"), allowing expressions like `10 * cube.log10()` and `~(cube == 0)` (#123)
- Support `PrivateJsonFile` permissions properly on Windows, using the oschmod library. (#198)
- Fixed some broken unit tests on Windows related to path (separator) handling. (#350)
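Expressions like `10 * cube.log10()` work because the cube objects overload Python operators to build a process graph instead of computing values. The following is a hypothetical, much-simplified sketch of that pattern (the `Expr` class and its dict-based graph encoding are illustrative; only the process ids `multiply`, `eq`, `not` come from the openEO process vocabulary):

```python
# Illustrative sketch: operator overloading that builds a nested
# process-graph dict instead of evaluating anything.
class Expr:
    def __init__(self, graph):
        self.graph = graph

    def _apply(self, process_id, **arguments):
        return Expr({"process_id": process_id, "arguments": arguments})

    def log10(self):
        return self._apply("log", x=self.graph, base=10)

    def __mul__(self, other):
        return self._apply("multiply", x=self.graph, y=other)

    def __rmul__(self, other):
        # Called for `10 * cube`, i.e. the reverse math operator case.
        return self._apply("multiply", x=other, y=self.graph)

    def __eq__(self, other):
        return self._apply("eq", x=self.graph, y=other)

    def __invert__(self):
        return self._apply("not", x=self.graph)

cube = Expr({"from_parameter": "x"})
expr = 10 * cube.log10()       # graph root: "multiply"
masked = ~(cube == 0)          # graph root: "not" wrapping "eq"
```

The `__rmul__` hook is what makes the reverse form `10 * cube` possible: Python falls back to it when `int.__mul__` cannot handle the cube operand.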
## openEO Python client v0.13.0

### Added

- Add `max_cloud_cover` argument to `load_collection()` to simplify setting maximum cloud cover (property `eo:cloud_cover`) (#328)

### Changed

- Improve default dimension metadata of a datacube created with `openeo.rest.datacube.DataCube.load_disk_collection`
- `DataCube.download()`: only automatically add a `save_result` node when there is none yet.
- Deprecation warnings: make sure they are shown by default and can be hidden when necessary.
- Rework and improve the `openeo.UDF` helper class for UDF usage (#312):
  - allow loading directly from a local file or URL
  - autodetect `runtime` from file/URL suffix or source code
  - hide implementation details around the `data` argument (e.g. `data={"from_parameter": "x"}`)
  - old usage patterns of `openeo.UDF` and `DataCube.apply_dimension()` still work but trigger deprecation warnings
- Show warning when using `load_collection` property filters that are not defined in the collection metadata (summaries).
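The `runtime` autodetection mentioned above can be sketched as a simple suffix lookup; note that the suffix-to-runtime mapping and function name here are assumptions for illustration, not the actual `openeo.UDF` internals (which can also inspect the source code itself):

```python
from pathlib import Path

# Hypothetical suffix -> openEO UDF runtime mapping.
_SUFFIX_RUNTIMES = {".py": "Python", ".r": "R"}

def detect_runtime(path: str, default: str = "Python") -> str:
    # Guess the UDF runtime from the file (or URL path) suffix,
    # falling back to a default when the suffix is unknown.
    return _SUFFIX_RUNTIMES.get(Path(path).suffix.lower(), default)
```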
## openEO Python client v0.12.1
## openEO Python client v0.12.0

### Added

- Allow passing a raw JSON string, JSON file path or URL to `Connection.download()`, `Connection.execute()` and `Connection.create_job()`
- Add support for reverse math operators on `DataCube` in `apply` mode (#323)
- Add `DataCube.print_json()` to simplify exporting process graphs in Jupyter or other interactive environments (#324)
- Raise `DimensionAlreadyExistsException` when trying to `add_dimension()` a dimension with an existing name (Open-EO/openeo-geopyspark-driver#205)

### Changed

- `DataCube.execute_batch()` now also guesses the output format from the filename, and allows using a `format` argument next to the current `out_format` to align with the `DataCube.download()` method. (#240)
- Better client-side handling of merged band name metadata in `DataCube.merge_cubes()`
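Guessing the output format from the filename boils down to an extension lookup. A hedged sketch, assuming a plausible extension table (the real table in the client may map more extensions and different format names):

```python
from pathlib import Path
from typing import Optional

# Hypothetical extension -> openEO output format table.
_EXT_FORMATS = {".tif": "GTiff", ".tiff": "GTiff", ".nc": "netCDF", ".json": "JSON"}

def guess_format(filename: str) -> Optional[str]:
    # Derive the output format from the target filename's extension;
    # return None when it cannot be guessed.
    return _EXT_FORMATS.get(Path(filename).suffix.lower())
```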
## openEO Python client v0.11.0

### Added

- Add support for passing a `PGNode`/`VectorCube` as geometry to `aggregate_spatial`, `mask_polygon`, ...
- Add support for second order callbacks, e.g. `is_valid` in `count` in `reduce_dimension` (#317)

### Changed

- Rename the `RESTJob` class to the less cryptic and more user-friendly `BatchJob`. The original `RESTJob` is still available as a deprecated alias. (#280)
- Dropped default reducer ("max") from `DataCube.reduce_temporal_simple()`
- Various documentation improvements
- Drop hardcoded `h5netcdf` engine from `XarrayIO.from_netcdf_file()` and `XarrayIO.to_netcdf_file()` (#314)
- Changed argument name of `Connection.describe_collection()` from `name` to `collection_id` to be more in line with other methods/functions.

### Fixed

- Fix `context`/`condition` confusion bug with `count` callback in `DataCube.reduce_dimension()` (#317)
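The rename-with-deprecated-alias pattern used for `RESTJob` → `BatchJob` can be sketched as follows (class bodies and the warning text here are illustrative, not the client's actual implementation):

```python
import warnings

class BatchJob:
    # The new, user-friendly class name.
    def __init__(self, job_id: str):
        self.job_id = job_id

class RESTJob(BatchJob):
    # Deprecated alias: behaves like BatchJob but warns on construction,
    # so existing user code keeps working while nudging toward the new name.
    def __init__(self, *args, **kwargs):
        warnings.warn(
            "RESTJob is deprecated, use BatchJob instead",
            DeprecationWarning,
            stacklevel=2,
        )
        super().__init__(*args, **kwargs)
```

Subclassing (rather than a bare `RESTJob = BatchJob` assignment) keeps `isinstance` checks working while still allowing the deprecation warning to fire.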
## openEO Python client v0.10.1 (LPS22 release)

### Added

- Add `context` parameter to `DataCube.aggregate_spatial()`, `DataCube.apply_dimension()`, `DataCube.apply_neighborhood()`, `DataCube.apply()`, `DataCube.merge_cubes()`. (#291)
- Add `DataCube.fit_regr_random_forest()` (#293)
- Add `PGNode.update_arguments()`, which combined with `DataCube.result_node()` allows advanced process graph argument tweaking/updating without `._pg` hacks.
- `JobResults.download_files()`: also download (by default) the job result metadata as a STAC JSON file (#184)
- OIDC handling in `Connection`: try to automatically refresh the access token when it expires (#298)
- `Connection.create_job` raises an exception if the response does not contain a valid `job_id`
- Add `openeo.udf.debug.inspect` for using the openEO `inspect` process in a UDF (#302)
- Add `openeo.util.to_bbox_dict()` to simplify building an openEO style bbox dictionary, e.g. from a list or shapely geometry (#304)

### Removed

- Removed deprecated (and non-functional) `zonal_statistics` method from the old `ImageCollectionClient` API. (#144)
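The bbox-dict helper can be sketched from the openEO convention of `west`/`south`/`east`/`north` keys; this simplified version handles only the list case (the real `openeo.util.to_bbox_dict()` also accepts shapely geometries and may differ in signature):

```python
from typing import Optional, Sequence

def to_bbox_dict(bbox: Sequence[float], crs: Optional[str] = None) -> dict:
    # Build an openEO-style bounding-box dict from a
    # [west, south, east, north] sequence.
    west, south, east, north = bbox
    d = {"west": west, "south": south, "east": east, "north": north}
    if crs is not None:
        d["crs"] = crs
    return d
```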
## openEO Python client v0.10.0 (SRR3 release)

### Added

- Add support for comparison operators (`<`, `>`, `<=` and `>=`) in callback process building
- Added `Connection.describe_process()` to retrieve and show a single process
- Added `DataCube.flatten_dimensions()` and `DataCube.unflatten_dimension` (Open-EO/openeo-processes#308, Open-EO/openeo-processes#316)
- Added `VectorCube.run_udf` (to avoid non-standard `process_with_node(UDF(...))` usage)
- Added `DataCube.fit_class_random_forest()` and `Connection.load_ml_model()` to train and load Machine Learning models (#279)
- Added `DataCube.predict_random_forest()` to easily use `reduce_dimension` with a `predict_random_forest` reducer, using a `MlModel` (trained with `fit_class_random_forest`) (#279)
- Added `DataCube.resample_cube_temporal` (#284)
- Add `target_dimension` argument to `DataCube.aggregate_spatial` (#288)
- Add basic configuration file system to define a default back-end URL and enable auto-authentication (#264, #187)
- Add `context` argument to `DataCube.chunk_polygon()`
- Add `Connection.version_info()` to list version information about the client, the API and the back-end

### Changed

- Include the openEO API error id automatically in exception messages to simplify user support and post-mortem analysis.
- Use `Connection.default_timeout` (when set) also on the version discovery request
- Drop `ImageCollection` from `DataCube`'s class hierarchy. This practically removes very old (pre-0.4.0) methods like `date_range_filter` and `bbox_filter` from `DataCube`. (#100, #278)
- Deprecate `DataCube.send_job` in favor of `DataCube.create_job` for better consistency (internally and with other libraries) (#276)
- Update (autogenerated) `openeo.processes` module to the 1.2.0 release (2021-12-13) of openeo-processes
- Update (autogenerated) `openeo.processes` module to a draft version of 2022-03-16 (e4df8648) of openeo-processes
- Update `openeo.extra.spectral_indices` to a post-0.0.6 version of Awesome Spectral Indices

### Removed

- Removed deprecated `zonal_statistics` method from the 1.x version of the API.
- Deprecate old-style `DataCube.polygonal_mean_timeseries()`, `DataCube.polygonal_histogram_timeseries()`, `DataCube.polygonal_median_timeseries()` and `DataCube.polygonal_standarddeviation_timeseries()`
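Including the API error id in exception messages might look like the following sketch; the class name, constructor signature and message layout are all assumptions for illustration, not the client's actual exception class:

```python
from typing import Optional

class OpenEoApiError(Exception):
    # Hypothetical API error wrapper: embeds the back-end's error id in the
    # message so users can quote it in support requests.
    def __init__(self, status_code: int, code: str, message: str,
                 error_id: Optional[str] = None):
        self.status_code = status_code
        self.code = code
        self.error_id = error_id
        suffix = f" (ref: {error_id})" if error_id else ""
        super().__init__(f"[{status_code}] {code}: {message}{suffix}")
```

Having the id in `str(exception)` means it survives into logs and tracebacks without any extra handling on the user's side.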
## openEO Python client v0.9.2

### Added

- Add experimental support for the `chunk_polygon` process (Open-EO/openeo-processes#287)
- Add support for `spatial_extent`, `temporal_extent` and `bands` to `Connection.load_result()`
- Setting the environment variable `OPENEO_BASEMAP_URL` allows setting a new templated URL to an XYZ basemap for the Vue Components library; `OPENEO_BASEMAP_ATTRIBUTION` allows setting the attribution for the basemap (#260)
- Initial support for the experimental "federation:missing" flag on partial openEO Platform user job listings (Open-EO/openeo-api#419)
- Best effort detection of mistakenly using the Python builtin `sum` or `all` functions in callbacks (Forum #113)
- Automatically print batch job logs when a job doesn't finish successfully (using `execute_batch`/`run_synchronous`/`start_and_wait`).
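The "best effort detection" of builtin `sum`/`all` misuse can be sketched as an identity check against the builtins module; the function name and exact check are assumptions about how such detection could work, not the client's actual code:

```python
import builtins

def looks_like_builtin_callback(callback) -> bool:
    # True when the user passed Python's builtin sum/all directly, which
    # would eagerly evaluate instead of building an openEO child process.
    return callback in (builtins.sum, builtins.all)
```

A client can use such a check to emit a helpful warning pointing users to the corresponding openEO processes instead.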