Releases: Open-EO/openeo-python-client
openEO Python Client v0.33.0
Added
- Added `DataCube.load_stac()` to also support creating a `load_stac` based cube without a connection (#638)
- `MultiBackendJobManager`: added `initialize_from_df(df)` (to `CsvJobDatabase` and `ParquetJobDatabase`) to initialize (and persist) the job database from a given DataFrame. Also added `create_job_db()` factory to easily create a job database from a given dataframe, with its type guessed from the filename extension. (#635)
- `MultiBackendJobManager.run_jobs()` now returns a dictionary with counters/stats about various events during the full run of the job manager (#645)
- Added (experimental) `ProcessBasedJobCreator` to be used as `start_job` callable with `MultiBackendJobManager` to create multiple jobs from a single parameterized process (e.g. a UDP or remote process definition) (#604)
Fixed
- When using `DataCube.load_collection()` without a connection, it is not necessary anymore to also explicitly set `fetch_metadata=False` (#638)
openEO Python Client v0.32.0
Added
- `load_stac`/`metadata_from_stac`: add support for extracting actual temporal dimension metadata (#567)
- `MultiBackendJobManager`: add `cancel_running_job_after` option to automatically cancel jobs that are running for too long (#590)
- Added `openeo.api.process.Parameter` helper to easily create a "spatial_extent" UDP parameter
- Wrap OIDC token request failure in more descriptive `OidcException` (related to #624)
- Added `auto_add_save_result` option (on by default) to disable automatic addition of `save_result` node on `download`/`create_job`/`execute_batch` (#513)
- Add support for `apply_vectorcube` UDF signature in `run_udf_code` (Open-EO/openeo-geopyspark-driver#881)
- `MultiBackendJobManager`: add API to run the update loop in a separate thread, allowing controlled interruption.
Changed
- `MultiBackendJobManager`: changed job metadata storage API, to enable working with large databases
- `DataCube.apply_polygon()`: rename `polygons` argument to `geometries`, but keep support for legacy `polygons` for now (#592, #511)
- Disallow ambiguous single string argument in `DataCube.filter_temporal()` (#628)
- Automatic adding of `save_result` from `download()` or `create_job()`: inspect whole process graph for pre-existing `save_result` nodes (related to #623, #401, #583)
- Disallow ambiguity of combining explicit `save_result` nodes and implicit `save_result` addition from `download()`/`create_job()` calls with `format` (related to #623, #401, #583)
Fixed
- `apply_dimension` with a `target_dimension` argument was not correctly adjusting datacube metadata on the client side, causing a mismatch
- Preserve non-spatial dimension metadata in `aggregate_spatial` (#612)
openEO Python Client v0.31.0
Added
- Add experimental `openeo.testing.results` subpackage with reusable test utilities for comparing batch job results with reference data
- `MultiBackendJobManager`: add initial support for storing job metadata in a Parquet file (instead of CSV) (#571)
- Add `Connection.authenticate_oidc_access_token()` to set up authorization headers with an access token that is obtained "out-of-band" (#598)
- Add `JobDatabaseInterface` to allow custom job metadata storage with `MultiBackendJobManager` (#571)
openEO Python Client v0.30.0
Added
- Add `openeo.udf.run_code.extract_udf_dependencies()` to extract UDF dependency declarations from UDF code (related to Open-EO/openeo-geopyspark-driver#237)
- Document PEP 723 based Python UDF dependency declarations (Open-EO/openeo-geopyspark-driver#237)
- Added more `openeo.api.process.Parameter` helpers to easily create "bounding_box", "date", "datetime", "geojson" and "temporal_interval" parameters for UDP construction
- Added convenience method `Connection.load_stac_from_job(job)` to easily load the results of a batch job with the `load_stac` process (#566)
- `load_stac`/`metadata_from_stac`: add support for extracting band info from "item_assets" in collection metadata (#573)
- Added initial `openeo.testing` submodule for reusable test utilities
Fixed
- Initial fix for broken `DataCube.reduce_temporal()` after `load_stac` (#568)
openEO Python Client v0.29.0
openEO Python Client v0.28.0
Added
- Introduced superclass `CubeMetadata` for `CollectionMetadata` for essential metadata handling (just dimensions for now) without collection-specific STAC metadata parsing (#464)
- Added `VectorCube.vector_to_raster()` (#550)
Changed
- Changed default `chunk_size` of various `download` functions from None to 10MB. This improves the handling of large downloads and reduces memory usage. (#528)
- `Connection.execute()` and `DataCube.execute()` now have an `auto_decode` argument. If set to True (default), the response is decoded as JSON and an exception is raised if this fails; if set to False, the raw `requests.Response` object is returned. (#499)
Fixed
- Preserve geo-referenced `x` and `y` coordinates in `execute_local_udf` (#549)
openEO Python Client v0.27.0
Added
- Add `DataCube.filter_labels()`
Changed
- Update autogenerated functions/methods in `openeo.processes` to definitions from openeo-processes project version 2.0.0-rc1. This removes `create_raster_cube`, `fit_class_random_forest`, `fit_regr_random_forest` and `save_ml_model`. Although removed from openeo-processes 2.0.0-rc1, support for `load_result`, `predict_random_forest` and `load_ml_model` is preserved but deprecated. (#424)
- Show more informative error message on `403 Forbidden` errors from CDSE firewall (#512)
- Handle API error responses more strictly and avoid hiding possibly important information in JSON-formatted but non-compliant error responses
openEO Python Client v0.26.0
Added
- Support new UDF signature: `def apply_datacube(cube: DataArray, context: dict) -> DataArray` (#310)
- Add `collection_property()` helper to easily build collection metadata property filters for `Connection.load_collection()` (#331)
- Add `DataCube.apply_polygon()` (standardized version of experimental `chunk_polygon`) (#424)
- Various improvements to band mapping with the Awesome Spectral Indices feature (#485, #501):
  - Allow explicitly specifying the satellite platform for band name mapping (e.g. "Sentinel2" or "LANDSAT8") if cube metadata lacks this info.
  - Follow the official band mapping from Awesome Spectral Indices better.
  - Allow manually specifying the desired band mapping.
- Also attempt to automatically refresh OIDC access token on a `401 TokenInvalid` response (in addition to `403 TokenInvalid`) (#508)
- Add `Parameter.object()` factory for `object` type parameters
Removed
- Remove custom spectral indices "NDGI", "NDMI" and "S2WI" from "extra-indices-dict.json" that were shadowing the official definitions from Awesome Spectral Indices (#501)
Fixed
- Initial support for "spectral indices" that use constants defined by Awesome Spectral Indices (#501)
openEO Python Client v0.25.0
openEO Python Client v0.24.0
Added
- Add `DataCube.reduce_spatial()`
- Added option (enabled by default) to automatically validate a process graph before execution. Validation issues just trigger warnings for now. (#404)
- Added "Sentinel1" band mapping support to the "Awesome Spectral Indices" wrapper (#484)
- Run tests in GitHub Actions against Python 3.12 as well
Changed
- Enforce `XarrayDataCube` dimension order in `execute_local_udf()` to (t, bands, y, x) to improve UDF interoperability with existing back-end implementations