Postgis driver support #1032

Merged (36 commits) on Jul 11, 2024

Commits
ec05e69
[pre-commit.ci] pre-commit autoupdate (#1022)
pre-commit-ci[bot] Jun 12, 2024
26e5740
Refactor run_sql to be shared code. Fix and document permissions beha…
SpacemanPaul Jun 13, 2024
d25014e
Switch datacube-ows-update -e env option to -E env for consistency wi…
SpacemanPaul Jun 14, 2024
6a997f9
Build postgis test database, including migrating geodata_coast to eo3.
SpacemanPaul Jun 18, 2024
a9245dc
Add and register postgis index driver - no functionality implemented.
SpacemanPaul Jun 19, 2024
eb410a0
Schema creation and management for postgis indexes.
SpacemanPaul Jun 21, 2024
8999dda
Postgis implementation, end-to-end testing and cleanup.
SpacemanPaul Jul 3, 2024
65fba4f
Lintage/spelling and fix docker-compose for GH running.
SpacemanPaul Jul 3, 2024
a9c5ef2
Attempt to get multi-db docker-compose test framework working in GHA.
SpacemanPaul Jul 5, 2024
a37ed00
Fix pg_ready args.
SpacemanPaul Jul 5, 2024
8aaa7b9
xargs was mangling pg_isready args.
SpacemanPaul Jul 5, 2024
50194fc
Still trying to get pg_isready args passed correctly
SpacemanPaul Jul 5, 2024
d2b6684
Ah need to replace DB_HOSTNAME.
SpacemanPaul Jul 5, 2024
01a74df
and typo.
SpacemanPaul Jul 5, 2024
72e00f5
Update check-all-code.sh
SpacemanPaul Jul 5, 2024
8ad4a9a
Separate probe db?
SpacemanPaul Jul 5, 2024
c073af5
Oh of course, datacube-ows-update can't be called until after we've s…
SpacemanPaul Jul 5, 2024
6908bd0
Haha. Oops.
SpacemanPaul Jul 5, 2024
a2a98f2
Core/OWS API fixes and workaround permissions bug in core.
SpacemanPaul Jul 8, 2024
fb35e4b
Lintage and mypy.
SpacemanPaul Jul 8, 2024
9a4dec1
Convert 4326 before clipping to CRS when generating CRS specific bboxes.
SpacemanPaul Jul 8, 2024
67cd11f
MyPy fix.
SpacemanPaul Jul 8, 2024
1f9b91a
Fix time-query handling.
SpacemanPaul Jul 9, 2024
f3d3893
Lintage.
SpacemanPaul Jul 9, 2024
688d9be
mypy fix.
SpacemanPaul Jul 9, 2024
23638e8
Fix postgis spatial extent method.
SpacemanPaul Jul 9, 2024
3c7bb54
Improve test coverage and remove unused code.
SpacemanPaul Jul 9, 2024
9e26150
Remove unused imports.
SpacemanPaul Jul 9, 2024
7a3ee65
Remove excess progress noise from datacube-ows-update --schema type c…
SpacemanPaul Jul 10, 2024
2fa19f8
Update documentation.
SpacemanPaul Jul 10, 2024
5e8ed81
Silly spelling word.
SpacemanPaul Jul 10, 2024
9785803
More spelling.
SpacemanPaul Jul 10, 2024
dcc5735
Cleanup, add more comments.
SpacemanPaul Jul 11, 2024
d1c9056
Cleanup, add more comments.
SpacemanPaul Jul 11, 2024
37426d7
Fix formatting of progress messages in run_sql().
SpacemanPaul Jul 11, 2024
4f0b5e0
Fix sql.py copypasta.
SpacemanPaul Jul 11, 2024
15 changes: 10 additions & 5 deletions .env_simple
@@ -4,12 +4,17 @@
################
# ODC DB Config
# ##############
ODC_DEFAULT_DB_URL=postgresql://opendatacubeusername:opendatacubepassword@postgres:5432/opendatacube
ODC_DEFAULT_DB_URL=postgresql://opendatacubeusername:opendatacubepassword@postgres:5432/odc_postgres
ODC_OWSPOSTGIS_DB_URL=postgresql://opendatacubeusername:opendatacubepassword@postgres:5432/odc_postgis

# Needed for docker db image.
DB_PORT=5432
DB_USERNAME=opendatacubeusername
DB_PASSWORD=opendatacubepassword
DB_DATABASE=opendatacube
POSTGRES_PORT=5432
POSTGRES_HOSTNAME=postgres
POSTGRES_USER=opendatacubeusername
SERVER_DB_USERNAME=opendatacubeusername
POSTGRES_PASSWORD=opendatacubepassword
POSTGRES_DB="odc_postgres,odc_postgis"
READY_PROBE_DB=odc_postgis

#################
# OWS CFG Config
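The two ODC_*_DB_URL variables above follow ODC's per-environment configuration convention, so this deployment exposes two ODC environments: `default` (backed by the postgres index driver) and `owspostgis` (backed by the new postgis index driver). A minimal Python sketch, assuming both variables are exported and both databases have been initialised, that opens each environment and reports which index implementation backs it:

    from datacube import Datacube

    # "default" comes from ODC_DEFAULT_DB_URL, "owspostgis" from ODC_OWSPOSTGIS_DB_URL.
    for env in ("default", "owspostgis"):
        dc = Datacube(env=env)
        # Prints the module of the index driver backing this environment.
        print(f"{env}: {type(dc.index).__module__}")
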
1 change: 0 additions & 1 deletion .github/workflows/lint.yml
@@ -40,7 +40,6 @@ jobs:
- name: Install dependencies and run pylint
run: |
pip install .[test,dev]
pip install pylint
pylint -j 2 --reports no datacube_ows --disable=C,R,W,E1136

flake8:
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -15,7 +15,7 @@ repos:
# hooks:
# - id: bandit
- repo: https://github.com/PyCQA/pylint
rev: v3.1.0
rev: v3.2.3
hooks:
- id: pylint
args: ["--disable=C,R,W,E1136"]
6 changes: 6 additions & 0 deletions README.rst
@@ -185,6 +185,10 @@ The following instructions are for installing on a clean Linux system.
export DATACUBE_OWS_CFG=datacube_ows.ows_cfg_example.ows_cfg
datacube-ows-update --write-role ubuntu --schema

# If you are not using the `default` ODC environment, you can specify the environment to create the schema in:

datacube-ows-update -E myenv --write-role ubuntu --schema


* Create a configuration file for your service, and all data products you wish to publish in
it.
@@ -199,7 +203,9 @@ The following instructions are for installing on a clean Linux system.

* When additional datasets are added to the datacube, the following steps will need to be run::

# Update the materialised views (postgres index driver only - can be skipped for the postgis index driver):
datacube-ows-update --views
# Update the range tables (both index drivers)
datacube-ows-update

* If you are accessing data on AWS S3 and running `datacube_ows` on Ubuntu you may encounter errors with ``GetMap``
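The post-indexing workflow documented in the README hunk above (refresh materialised views for the postgres driver, then update the range tables for either driver) is often run from a scheduler. A hedged sketch that scripts only the CLI flags shown above via subprocess; the "owspostgis" environment name is an example:

    import subprocess

    def refresh_ows(env: str, postgis_driver: bool) -> None:
        if not postgis_driver:
            # Materialised view refresh: postgres index driver only.
            subprocess.run(["datacube-ows-update", "-E", env, "--views"], check=True)
        # Range table update: needed for both index drivers.
        subprocess.run(["datacube-ows-update", "-E", env], check=True)

    refresh_ows("default", postgis_driver=False)
    refresh_ows("owspostgis", postgis_driver=True)
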
44 changes: 37 additions & 7 deletions check-code-all.sh
@@ -5,30 +5,45 @@ set -ex
# ensure db is ready
sh ./docker/ows/wait-for-db

# Initialise ODC schema
# Initialise ODC schemas

datacube system init
datacube -E owspostgis system init
datacube -E owspostgis spindex create 3577
datacube -E owspostgis system init

# Add extended metadata types

datacube metadata add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/product_metadata/eo3_landsat_ard.odc-type.yaml
datacube metadata add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/product_metadata/eo3_sentinel_ard.odc-type.yaml

datacube -E owspostgis metadata add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/product_metadata/eo3_landsat_ard.odc-type.yaml
datacube -E owspostgis metadata add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/product_metadata/eo3_sentinel_ard.odc-type.yaml

# Test products
datacube product add ./integration_tests/metadata/s2_l2a_prod.yaml
datacube product add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/products/baseline_satellite_data/c3/ga_s2am_ard_3.odc-product.yaml
datacube product add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/products/baseline_satellite_data/c3/ga_s2bm_ard_3.odc-product.yaml
datacube product add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/products/land_and_vegetation/c3_fc/ga_ls_fc_3.odc-product.yaml

datacube -E owspostgis product add ./integration_tests/metadata/s2_l2a_prod.yaml
datacube -E owspostgis product add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/products/baseline_satellite_data/c3/ga_s2am_ard_3.odc-product.yaml
datacube -E owspostgis product add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/products/baseline_satellite_data/c3/ga_s2bm_ard_3.odc-product.yaml
datacube -E owspostgis product add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/products/land_and_vegetation/c3_fc/ga_ls_fc_3.odc-product.yaml

# add flag masking products
datacube product add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/products/sea_ocean_coast/geodata_coast_100k/geodata_coast_100k.odc-product.yaml
datacube product add ./integration_tests/metadata/product_geodata_coast_100k.yaml
datacube product add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/products/inland_water/c3_wo/ga_ls_wo_3.odc-product.yaml

datacube -E owspostgis product add ./integration_tests/metadata/product_geodata_coast_100k.yaml
datacube -E owspostgis product add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/products/inland_water/c3_wo/ga_ls_wo_3.odc-product.yaml

# Geomedian for summary product testing

datacube product add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/products/baseline_satellite_data/geomedian-au/ga_ls8c_nbart_gm_cyear_3.odc-product.yaml
datacube -E owspostgis product add https://raw.githubusercontent.com/GeoscienceAustralia/dea-config/master/products/baseline_satellite_data/geomedian-au/ga_ls8c_nbart_gm_cyear_3.odc-product.yaml

# S2 datasets from us-west-2 (might not work)
# S2 datasets from us-west-2 and eo3ified geodata_coast
MDL=./integration_tests/metadata
python ${MDL}/metadata_importer.py <<EOF
${MDL}/s2_l2a_ds_01.yaml https://sentinel-cogs.s3.us-west-2.amazonaws.com/sentinel-s2-l2a-cogs/51/L/XD/2021/12/S2B_51LXD_20211231_0_L2A/S2B_51LXD_20211231_0_L2A.json
@@ -95,6 +110,8 @@ ${MDL}/s2_l2a_ds_61.yaml https://sentinel-cogs.s3.us-west-2.amazonaws.com/sentin
${MDL}/s2_l2a_ds_62.yaml https://sentinel-cogs.s3.us-west-2.amazonaws.com/sentinel-s2-l2a-cogs/51/L/XE/2021/12/S2B_51LXE_20211221_1_L2A/S2B_51LXE_20211221_1_L2A.json
${MDL}/s2_l2a_ds_63.yaml https://sentinel-cogs.s3.us-west-2.amazonaws.com/sentinel-s2-l2a-cogs/51/L/XE/2021/12/S2B_51LXE_20211221_0_L2A/S2B_51LXE_20211221_0_L2A.json
${MDL}/s2_l2a_ds_64.yaml https://sentinel-cogs.s3.us-west-2.amazonaws.com/sentinel-s2-l2a-cogs/51/L/YE/2021/12/S2B_51LYE_20211221_1_L2A/S2B_51LYE_20211221_1_L2A.json
${MDL}/COAST_100K_8_-21.yaml https://data.dea.ga.gov.au/projects/geodata_coast_100k/v2004/x_8/y_-21/COAST_100K_8_-21.yaml
${MDL}/COAST_100K_15_-40.yaml https://data.dea.ga.gov.au/projects/geodata_coast_100k/v2004/x_15/y_-40/COAST_100K_15_-40.yaml
EOF

# S2 multiproduct datasets
@@ -105,20 +122,33 @@ datacube dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/bas
datacube dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/baseline/ga_s2am_ard_3/52/LGM/2017/07/24/20170724T030641/ga_s2am_ard_3-2-1_52LGM_2017-07-24_final.odc-metadata.yaml --ignore-lineage
datacube dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/baseline/ga_s2am_ard_3/52/LGM/2017/08/03/20170921T103758/ga_s2am_ard_3-2-1_52LGM_2017-08-03_final.odc-metadata.yaml --ignore-lineage

# flag masking datasets
datacube dataset add https://data.dea.ga.gov.au/projects/geodata_coast_100k/v2004/x_15/y_-40/COAST_100K_15_-40.yaml
datacube dataset add https://data.dea.ga.gov.au/projects/geodata_coast_100k/v2004/x_8/y_-21/COAST_100K_8_-21.yaml
datacube -E owspostgis dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/baseline/ga_s2bm_ard_3/52/LGM/2017/07/19/20170719T030622/ga_s2bm_ard_3-2-1_52LGM_2017-07-19_final.odc-metadata.yaml --ignore-lineage
datacube -E owspostgis dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/baseline/ga_s2bm_ard_3/52/LGM/2017/07/29/20170729T081630/ga_s2bm_ard_3-2-1_52LGM_2017-07-29_final.odc-metadata.yaml --ignore-lineage
datacube -E owspostgis dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/baseline/ga_s2bm_ard_3/52/LGM/2017/08/08/20170818T192649/ga_s2bm_ard_3-2-1_52LGM_2017-08-08_final.odc-metadata.yaml --ignore-lineage
datacube -E owspostgis dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/baseline/ga_s2am_ard_3/52/LGM/2017/07/14/20170714T082022/ga_s2am_ard_3-2-1_52LGM_2017-07-14_final.odc-metadata.yaml --ignore-lineage
datacube -E owspostgis dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/baseline/ga_s2am_ard_3/52/LGM/2017/07/24/20170724T030641/ga_s2am_ard_3-2-1_52LGM_2017-07-24_final.odc-metadata.yaml --ignore-lineage
datacube -E owspostgis dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/baseline/ga_s2am_ard_3/52/LGM/2017/08/03/20170921T103758/ga_s2am_ard_3-2-1_52LGM_2017-08-03_final.odc-metadata.yaml --ignore-lineage

# flag masking datasets
datacube dataset add https://data.dea.ga.gov.au/derivative/ga_ls_wo_3/1-6-0/094/077/2018/02/08/ga_ls_wo_3_094077_2018-02-08_final.odc-metadata.yaml --ignore-lineage
datacube dataset add https://data.dea.ga.gov.au/derivative/ga_ls_fc_3/2-5-1/094/077/2018/02/08/ga_ls_fc_3_094077_2018-02-08_final.odc-metadata.yaml --ignore-lineage

datacube -E owspostgis dataset add https://data.dea.ga.gov.au/derivative/ga_ls_wo_3/1-6-0/094/077/2018/02/08/ga_ls_wo_3_094077_2018-02-08_final.odc-metadata.yaml --ignore-lineage
datacube -E owspostgis dataset add https://data.dea.ga.gov.au/derivative/ga_ls_fc_3/2-5-1/094/077/2018/02/08/ga_ls_fc_3_094077_2018-02-08_final.odc-metadata.yaml --ignore-lineage

# Geomedian datasets
datacube dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/derivative/ga_ls8c_nbart_gm_cyear_3/3-0-0/x17/y37/2019--P1Y/ga_ls8c_nbart_gm_cyear_3_x17y37_2019--P1Y_final.odc-metadata.yaml --ignore-lineage
datacube dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/derivative/ga_ls8c_nbart_gm_cyear_3/3-0-0/x17/y37/2020--P1Y/ga_ls8c_nbart_gm_cyear_3_x17y37_2020--P1Y_final.odc-metadata.yaml --ignore-lineage
datacube dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/derivative/ga_ls8c_nbart_gm_cyear_3/3-0-0/x17/y37/2021--P1Y/ga_ls8c_nbart_gm_cyear_3_x17y37_2021--P1Y_final.odc-metadata.yaml --ignore-lineage

datacube -E owspostgis dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/derivative/ga_ls8c_nbart_gm_cyear_3/3-0-0/x17/y37/2019--P1Y/ga_ls8c_nbart_gm_cyear_3_x17y37_2019--P1Y_final.odc-metadata.yaml --ignore-lineage
datacube -E owspostgis dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/derivative/ga_ls8c_nbart_gm_cyear_3/3-0-0/x17/y37/2020--P1Y/ga_ls8c_nbart_gm_cyear_3_x17y37_2020--P1Y_final.odc-metadata.yaml --ignore-lineage
datacube -E owspostgis dataset add https://dea-public-data.s3.ap-southeast-2.amazonaws.com/derivative/ga_ls8c_nbart_gm_cyear_3/3-0-0/x17/y37/2021--P1Y/ga_ls8c_nbart_gm_cyear_3_x17y37_2021--P1Y_final.odc-metadata.yaml --ignore-lineage

# create material view for ranges extents
datacube-ows-update --schema --write-role $DB_USERNAME
datacube-ows-update --schema --write-role $POSTGRES_USER --read-role $SERVER_DB_USERNAME

datacube-ows-update -E owspostgis --schema --write-role $POSTGRES_USER --read-role $SERVER_DB_USERNAME
datacube-ows-update

# Run tests, taking coverage.
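Because check-code-all.sh now indexes every product and dataset twice (once into the default postgres environment, once into owspostgis), a quick parity check is useful when debugging the test database build. A hedged sketch, not part of the PR; the product names are assumed to match a subset of the product definitions added above:

    from datacube import Datacube

    PRODUCTS = ("s2_l2a", "ga_ls_wo_3", "ga_ls_fc_3", "ga_ls8c_nbart_gm_cyear_3")

    for env in ("default", "owspostgis"):
        dc = Datacube(env=env)
        # Count datasets per product in this environment's index.
        counts = {name: dc.index.datasets.count(product=name) for name in PRODUCTS}
        print(env, counts)
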
16 changes: 7 additions & 9 deletions datacube_ows/data.py
@@ -6,7 +6,7 @@

import logging
from datetime import date, datetime, timedelta
from typing import Any, cast
from typing import Any

import numpy
import numpy.ma
@@ -18,8 +18,7 @@
from rasterio.io import MemoryFile

from datacube_ows.http_utils import FlaskResponse, json_response, png_response
from datacube_ows.loading import DataStacker, ProductBandQuery
from datacube_ows.mv_index import MVSelectOpts
from datacube_ows.loading import DataStacker
from datacube_ows.ogc_exceptions import WMSException
from datacube_ows.ogc_utils import xarray_image_as_png
from datacube_ows.ows_configuration import OWSNamedLayer
@@ -102,7 +101,7 @@ def get_map(args: dict[str, str]) -> FlaskResponse:
stacker = DataStacker(params.layer, params.geobox, params.times, params.resampling, style=params.style)
qprof["zoom_factor"] = params.zf
qprof.start_event("count-datasets")
n_datasets = stacker.datasets(params.layer.dc.index, mode=MVSelectOpts.COUNT)
n_datasets = stacker.n_datasets()
qprof.end_event("count-datasets")
qprof["n_datasets"] = n_datasets
qprof["zoom_level_base"] = params.resources.base_zoom_level
@@ -113,8 +112,7 @@ def get_map(args: dict[str, str]) -> FlaskResponse:
stacker.resource_limited = True
qprof["resource_limited"] = str(e)
if qprof.active:
q_ds_dict = cast(dict[ProductBandQuery, xarray.DataArray],
stacker.datasets(params.layer.dc.index, mode=MVSelectOpts.DATASETS))
q_ds_dict = stacker.datasets()
qprof["datasets"] = []
for q, dsxr in q_ds_dict.items():
query_res: dict[str, Any] = {}
@@ -129,7 +127,7 @@ def get_map(args: dict[str, str]) -> FlaskResponse:
qprof["datasets"].append(query_res)
if stacker.resource_limited and not params.layer.low_res_product_names:
qprof.start_event("extent-in-query")
extent = cast(geom.Geometry | None, stacker.datasets(params.layer.dc.index, mode=MVSelectOpts.EXTENT))
extent = stacker.extent(crs=params.crs)
qprof.end_event("extent-in-query")
if extent is None:
qprof["write_action"] = "No extent: Write Empty"
@@ -149,10 +147,10 @@
else:
if stacker.resource_limited:
qprof.start_event("count-summary-datasets")
qprof["n_summary_datasets"] = stacker.datasets(params.layer.dc.index, mode=MVSelectOpts.COUNT)
qprof["n_summary_datasets"] = stacker.n_datasets()
qprof.end_event("count-summary-datasets")
qprof.start_event("fetch-datasets")
datasets = cast(dict[ProductBandQuery, xarray.DataArray], stacker.datasets(params.layer.dc.index))
datasets = stacker.datasets()
for flagband, dss in datasets.items():
if not dss.any():
_LOG.warning("Flag band %s returned no data", str(flagband))
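The data.py changes above replace direct index access (stacker.datasets(params.layer.dc.index, mode=MVSelectOpts...)) with methods on DataStacker itself, so get_map() no longer needs MVSelectOpts or the layer's index handle. A sketch of the new call pattern as implied by the call sites in this diff; the layer, geobox, times and crs arguments are assumed to come from request parsing, as in get_map():

    from datacube_ows.loading import DataStacker

    def summarise_query(layer, geobox, times, crs):
        stacker = DataStacker(layer, geobox, times)
        n = stacker.n_datasets()             # was stacker.datasets(index, mode=MVSelectOpts.COUNT)
        footprint = stacker.extent(crs=crs)  # was stacker.datasets(index, mode=MVSelectOpts.EXTENT)
        per_query = stacker.datasets()       # was stacker.datasets(index, mode=MVSelectOpts.DATASETS)
        return n, footprint, per_query
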
7 changes: 2 additions & 5 deletions datacube_ows/feature_info.py
@@ -158,7 +158,7 @@ def feature_info(args: dict[str, str]) -> FlaskResponse:
stacker = DataStacker(params.layer, geo_point_geobox, params.times)
# --- Begin code section requiring datacube.
cfg = get_config()
all_time_datasets = cast(xarray.DataArray, stacker.datasets(params.layer.dc.index, all_time=True, point=geo_point))
all_time_datasets = stacker.datasets_all_time(point=geo_point)

# Taking the data as a single point so our indexes into the data should be 0,0
h_coord = cast(str, cfg.published_CRSs[params.crsid]["horizontal_coord"])
@@ -174,10 +174,7 @@ def feature_info(args: dict[str, str]) -> FlaskResponse:
global_info_written = False
feature_json["data"] = []
fi_date_index: dict[datetime, RAW_CFG] = {}
time_datasets = cast(
dict[ProductBandQuery, xarray.DataArray],
stacker.datasets(params.layer.dc.index, all_flag_bands=True, point=geo_point)
)
time_datasets = stacker.datasets(all_flag_bands=True, point=geo_point)
data = stacker.data(time_datasets, skip_corrections=True)
if data is not None:
for dt in data.time.values:
24 changes: 23 additions & 1 deletion datacube_ows/index/api.py
@@ -13,7 +13,7 @@
from datacube import Datacube
from datacube.index.abstract import AbstractIndex
from datacube.model import Product, Dataset
from odc.geo import Geometry, CRS
from odc.geo.geom import Geometry, CRS, polygon

from datacube_ows.config_utils import CFG_DICT, ConfigException

@@ -128,6 +128,7 @@ def extent(self,
products: Iterable[Product] | None = None,
crs: CRS | None = None
) -> Geometry | None:
geom = self._prep_geom(layer, geom)
if crs is None:
crs = CRS("epsg:4326")
ext: Geometry | None = None
@@ -148,6 +149,27 @@ def extent(self,
return ext.to_crs(crs)
return ext

def _prep_geom(self, layer: "OWSNamedLayer", any_geom: Geometry | None) -> Geometry | None:
if any_geom is None:
return None
if any_geom.geom_type == "Point":
any_geom = any_geom.to_crs(layer.native_CRS)
x, y = any_geom.coords[0]
delta_x, delta_y = layer.cfg_native_resolution
return polygon(
(
(x, y),
(x + delta_x, y),
(x + delta_x, y + delta_y),
(x, y + delta_y),
(x, y),
),
crs=layer.native_CRS
)
elif any_geom.geom_type in ("MultiPoint", "LineString", "MultiLineString"):
return any_geom.convex_hull
else:
return any_geom

class OWSAbstractIndexDriver(ABC):
@classmethod
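The new _prep_geom() helper above ensures the spatial index always receives an area geometry: a Point is inflated to a one-pixel polygon in the layer's native CRS, while MultiPoint and line geometries are replaced by their convex hull. A standalone illustration of the point branch using odc.geo.geom; the EPSG:3577 CRS and 30 m pixel size are example values, not taken from a real layer configuration:

    from odc.geo.geom import point, polygon

    pt = point(1_500_000.0, -4_000_000.0, crs="EPSG:3577")
    delta_x, delta_y = 30.0, -30.0   # stand-in for layer.cfg_native_resolution
    x, y = pt.coords[0]
    # Same construction as _prep_geom(): a one-pixel box anchored at the query point.
    pixel = polygon(
        ((x, y), (x + delta_x, y), (x + delta_x, y + delta_y), (x, y + delta_y), (x, y)),
        crs="EPSG:3577",
    )
    print(pixel.boundingbox)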