Improve RFC 3339 datetime handling (#368)
* improve RFC 3339 datetime handling

* install build-essential

* sudo, rearrange files

* gapt -> apt

* install build-essential in docker img

* move ciso8601 dep

* fix typing for parse_interval function

* fix import of parse_interval

* more changes for datetimes

* get the types right

* remove unnecessary parens

* remove classmethod

* add some interval tests

* replace rfc3339_str with pystac.utils.datetime_to_str

* fix search datetime parameter parsing to support empty string as open end of interval

* rename methods

* fix accidental method name replacement for parse_rfc3339

* update tests for double open ended temporal interval

* fix handling of empty string open-ended interval

* fix test that was successful with double-open-ended datetime interval to now fail

* replace ciso8601 with python-dateutil

* Revert "replace ciso8601 with python-dateutil"

This reverts commit 9f400f4.

* add pystac dependency to types

* add double-open-ended tests to pgstac tests

* skip mixed open-ended in pgstac

* skip datetime interval empty string in pgstac

* maybe just await the test

* Bump black version to avoid psf/black#2964

* lint

Co-authored-by: Nathan Zimmerman <[email protected]>
Phil Varner and moradology authored Apr 14, 2022
1 parent c8c8819 commit 99ce774
Showing 16 changed files with 267 additions and 98 deletions.
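The changes below centralize RFC 3339 handling in a shared `stac_fastapi.types.rfc3339` module, whose helpers (`rfc3339_str_to_datetime`, `now_to_rfc3339_str`) replace the fixed `DATETIME_RFC339` strptime/strftime format used previously. The module itself is not shown in the hunks below; the following is only a minimal sketch of what such helpers could look like, assuming ciso8601 for parsing (the dependency the Dockerfile comment references) and `pystac.utils.datetime_to_str` for serialization — the actual implementation may differ.

```python
# Hedged sketch only: the real stac_fastapi.types.rfc3339 module may differ.
from datetime import datetime, timezone

import ciso8601  # C-extension parser; the reason build-essential is installed below
from pystac.utils import datetime_to_str


def rfc3339_str_to_datetime(s: str) -> datetime:
    """Parse an RFC 3339 datetime string into a timezone-aware datetime."""
    # ciso8601.parse_rfc3339 enforces RFC 3339 (timezone offset required),
    # unlike a single fixed strptime format.
    return ciso8601.parse_rfc3339(s)


def now_to_rfc3339_str() -> str:
    """Serialize the current UTC time as an RFC 3339 string."""
    return datetime_to_str(datetime.now(timezone.utc))
```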
2 changes: 1 addition & 1 deletion .pre-commit-config.yaml
@@ -6,7 +6,7 @@ repos:
language_version: python3.8
-
repo: https://github.com/psf/black
rev: 20.8b1
rev: 22.3.0
hooks:
- id: black
args: ['--safe']
6 changes: 4 additions & 2 deletions Dockerfile.docs
@@ -1,19 +1,21 @@
FROM python:3.8-slim

# build-essential is required to build a wheel for ciso8601
RUN apt update && apt install -y build-essential

RUN python -m pip install --upgrade pip
RUN python -m pip install mkdocs mkdocs-material pdocs

COPY . /opt/src

WORKDIR /opt/src

RUN python -m pip install -e \
RUN python -m pip install \
stac_fastapi/api \
stac_fastapi/types \
stac_fastapi/extensions \
stac_fastapi/sqlalchemy


CMD ["pdocs", \
"as_markdown", \
"--output_dir", \
15 changes: 14 additions & 1 deletion README.md
@@ -62,6 +62,7 @@ pip install -e stac_fastapi/pgstac
```

## Local Development

Use docker-compose to deploy the application, migrate the database, and ingest some example data:
```bash
docker-compose build
@@ -73,11 +74,23 @@ docker-compose up app-sqlalchemy
docker-compose up app-pgstac
```

For local development it is often more convenient to run the application outside of docker-compose:
For local development it is often more convenient to run the application outside docker-compose:
```bash
make docker-run
```

Before commit, install the [pre-commit](https://pre-commit.com) hooks with:

```shell
pre-commit install
```

The pre-commit hooks can be run manually with:

```shell
pre-commit run --all-files
```

#### Note to Docker for Windows users

You'll need to enable experimental features on Docker for Windows in order to run the docker-compose, due to the "--platform" flag that is required to allow the project to run on some Apple architectures. To do this, open Docker Desktop, go to settings, select "Docker Engine", and modify the configuration JSON to have `"experimental": true`.
2 changes: 1 addition & 1 deletion stac_fastapi/api/setup.py
@@ -40,7 +40,7 @@
"License :: OSI Approved :: MIT License",
],
keywords="STAC FastAPI COG",
author=u"Arturo Engineering",
author="Arturo Engineering",
author_email="[email protected]",
url="https://github.com/stac-utils/stac-fastapi",
license="MIT",
1 change: 0 additions & 1 deletion stac_fastapi/api/stac_fastapi/api/models.py
@@ -160,7 +160,6 @@ class GeoJSONResponse(ORJSONResponse):

media_type = "application/geo+json"


else:
from starlette.responses import JSONResponse

2 changes: 1 addition & 1 deletion stac_fastapi/extensions/setup.py
@@ -39,7 +39,7 @@
"License :: OSI Approved :: MIT License",
],
keywords="STAC FastAPI COG",
author=u"Arturo Engineering",
author="Arturo Engineering",
author_email="[email protected]",
url="https://github.com/stac-utils/stac-fastapi",
license="MIT",
2 changes: 1 addition & 1 deletion stac_fastapi/pgstac/stac_fastapi/pgstac/core.py
@@ -186,7 +186,7 @@ async def item_collection(
Called with `GET /collections/{collection_id}/items`
Args:
id: id of the collection.
collection_id: id of the collection.
limit: number of items to return.
token: pagination token.
50 changes: 17 additions & 33 deletions stac_fastapi/pgstac/tests/resources/test_item.py
@@ -1,18 +1,19 @@
import json
import uuid
from datetime import datetime, timedelta
from datetime import timedelta
from typing import Callable
from urllib.parse import parse_qs, urljoin, urlparse

import pystac
import pytest
from httpx import AsyncClient
from pystac.utils import datetime_to_str
from shapely.geometry import Polygon
from stac_pydantic import Collection, Item
from stac_pydantic.shared import DATETIME_RFC339
from starlette.requests import Request

from stac_fastapi.pgstac.models.links import CollectionLinks
from stac_fastapi.types.rfc3339 import rfc3339_str_to_datetime


@pytest.mark.asyncio
@@ -402,14 +403,14 @@ async def test_item_search_temporal_query_post(
)
assert resp.status_code == 200

item_date = datetime.strptime(test_item["properties"]["datetime"], DATETIME_RFC339)
item_date = rfc3339_str_to_datetime(test_item["properties"]["datetime"])
print(item_date)
item_date = item_date + timedelta(seconds=1)

params = {
"collections": [test_item["collection"]],
"intersects": test_item["geometry"],
"datetime": item_date.strftime(DATETIME_RFC339),
"datetime": datetime_to_str(item_date),
}

resp = await app_client.post("/search", json=params)
@@ -437,14 +438,15 @@ async def test_item_search_temporal_window_post(
)
assert resp.status_code == 200

item_date = datetime.strptime(test_item["properties"]["datetime"], DATETIME_RFC339)
item_date = rfc3339_str_to_datetime(test_item["properties"]["datetime"])
item_date_before = item_date - timedelta(seconds=1)
item_date_after = item_date + timedelta(seconds=1)

params = {
"collections": [test_item["collection"]],
"datetime": f"{item_date_before.strftime(DATETIME_RFC339)}/{item_date_after.strftime(DATETIME_RFC339)}",
"datetime": f"{datetime_to_str(item_date_before)}/{datetime_to_str(item_date_after)}",
}

resp = await app_client.post("/search", json=params)
resp_json = resp.json()
assert len(resp_json["features"]) == 1
@@ -455,34 +457,16 @@ async def test_item_search_temporal_window_post(
async def test_item_search_temporal_open_window(
app_client, load_test_data, load_test_collection
):
"""Test POST search with open spatio-temporal query (core)"""
test_item = load_test_data("test_item.json")
resp = await app_client.post(
f"/collections/{test_item['collection']}/items", json=test_item
)
assert resp.status_code == 200

# Add second item with a different datetime.
second_test_item = load_test_data("test_item2.json")
resp = await app_client.post(
f"/collections/{test_item['collection']}/items", json=second_test_item
)
assert resp.status_code == 200

params = {
"collections": [test_item["collection"]],
"datetime": "../..",
}
resp = await app_client.post("/search", json=params)
resp_json = resp.json()
assert len(resp_json["features"]) == 2
for dt in ["/", "../..", "../", "/.."]:
resp = await app_client.post("/search", json={"datetime": dt})
assert resp.status_code == 400


@pytest.mark.asyncio
async def test_item_search_sort_post(app_client, load_test_data, load_test_collection):
"""Test POST search with sorting (sort extension)"""
first_item = load_test_data("test_item.json")
item_date = datetime.strptime(first_item["properties"]["datetime"], DATETIME_RFC339)
item_date = rfc3339_str_to_datetime(first_item["properties"]["datetime"])
resp = await app_client.post(
f"/collections/{first_item['collection']}/items", json=first_item
)
@@ -491,7 +475,7 @@ async def test_item_search_sort_post(app_client, load_test_data, load_test_colle
second_item = load_test_data("test_item.json")
second_item["id"] = "another-item"
another_item_date = item_date - timedelta(days=1)
second_item["properties"]["datetime"] = another_item_date.strftime(DATETIME_RFC339)
second_item["properties"]["datetime"] = datetime_to_str(another_item_date)
resp = await app_client.post(
f"/collections/{second_item['collection']}/items", json=second_item
)
@@ -601,13 +585,13 @@ async def test_item_search_temporal_window_get(
)
assert resp.status_code == 200

item_date = datetime.strptime(test_item["properties"]["datetime"], DATETIME_RFC339)
item_date = rfc3339_str_to_datetime(test_item["properties"]["datetime"])
item_date_before = item_date - timedelta(seconds=1)
item_date_after = item_date + timedelta(seconds=1)

params = {
"collections": test_item["collection"],
"datetime": f"{item_date_before.strftime(DATETIME_RFC339)}/{item_date_after.strftime(DATETIME_RFC339)}",
"datetime": f"{datetime_to_str(item_date_before)}/{datetime_to_str(item_date_after)}",
}
resp = await app_client.get("/search", params=params)
resp_json = resp.json()
@@ -619,7 +603,7 @@
async def test_item_search_sort_get(app_client, load_test_data, load_test_collection):
"""Test GET search with sorting (sort extension)"""
first_item = load_test_data("test_item.json")
item_date = datetime.strptime(first_item["properties"]["datetime"], DATETIME_RFC339)
item_date = rfc3339_str_to_datetime(first_item["properties"]["datetime"])
resp = await app_client.post(
f"/collections/{first_item['collection']}/items", json=first_item
)
@@ -628,7 +612,7 @@ async def test_item_search_sort_get(app_client, load_test_data, load_test_collec
second_item = load_test_data("test_item.json")
second_item["id"] = "another-item"
another_item_date = item_date - timedelta(days=1)
second_item["properties"]["datetime"] = another_item_date.strftime(DATETIME_RFC339)
second_item["properties"]["datetime"] = datetime_to_str(another_item_date)
resp = await app_client.post(
f"/collections/{second_item['collection']}/items", json=second_item
)
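The updated pgstac test above now expects a 400 response for every interval that is open at both ends (`"/"`, `"../.."`, `"../"`, `"/.."`), while `""` and `".."` remain valid as a single open end. As a hedged illustration (not the actual stac-fastapi code), validation along these lines produces that behavior:

```python
# Illustration only; the real interval validation in stac-fastapi may differ.
from typing import Optional, Tuple


def split_interval(value: str) -> Tuple[Optional[str], Optional[str]]:
    """Split a datetime interval, treating '' and '..' as open ends."""
    parts = value.split("/")
    if len(parts) != 2:
        raise ValueError(f"Invalid datetime interval: {value!r}")
    start = None if parts[0] in ("", "..") else parts[0]
    end = None if parts[1] in ("", "..") else parts[1]
    if start is None and end is None:
        # "/", "../..", "../" and "/.." are all double open-ended: reject.
        raise ValueError("Double open-ended datetime intervals are not allowed")
    return start, end


print(split_interval("2020-02-12T12:30:22Z/.."))  # ('2020-02-12T12:30:22Z', None)
try:
    split_interval("../..")
except ValueError as exc:
    print(exc)  # rejected, surfaced to the client as HTTP 400
```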
2 changes: 1 addition & 1 deletion stac_fastapi/sqlalchemy/setup.py
@@ -47,7 +47,7 @@
"License :: OSI Approved :: MIT License",
],
keywords="STAC FastAPI COG",
author=u"Arturo Engineering",
author="Arturo Engineering",
author_email="[email protected]",
url="https://github.com/stac-utils/stac-fastapi",
license="MIT",
7 changes: 4 additions & 3 deletions stac_fastapi/sqlalchemy/stac_fastapi/sqlalchemy/core.py
@@ -349,13 +349,14 @@ def post_search(
# Non-interval date ex. "2000-02-02T00:00:00.00Z"
if len(dts) == 1:
query = query.filter(self.item_table.datetime == dts[0])
elif ".." not in search_request.datetime:
# is there a benefit to between instead of >= and <= ?
elif dts[0] not in ["", ".."] and dts[1] not in ["", ".."]:
query = query.filter(self.item_table.datetime.between(*dts))
# All items after the start date
elif dts[0] != "..":
elif dts[0] not in ["", ".."]:
query = query.filter(self.item_table.datetime >= dts[0])
# All items before the end date
elif dts[1] != "..":
elif dts[1] not in ["", ".."]:
query = query.filter(self.item_table.datetime <= dts[1])

# Query fields
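For clarity, the branch selection in the `post_search` hunk above can be traced with a few example inputs, assuming `dts` is simply the `datetime` parameter split on `/` (a hypothetical stand-in for however the handler actually produces it):

```python
# Worked example of the branch logic shown in the hunk above; `which_filter`
# is a hypothetical helper that mirrors the if/elif chain for illustration.
def which_filter(datetime_param: str) -> str:
    dts = datetime_param.split("/")
    if len(dts) == 1:
        return "item_table.datetime == dts[0]"
    if dts[0] not in ["", ".."] and dts[1] not in ["", ".."]:
        return "item_table.datetime BETWEEN dts[0] AND dts[1]"
    if dts[0] not in ["", ".."]:
        return "item_table.datetime >= dts[0]"
    if dts[1] not in ["", ".."]:
        return "item_table.datetime <= dts[1]"
    return "no filter (double open-ended, rejected before this point)"


print(which_filter("2020-02-02T00:00:00Z"))                       # equality
print(which_filter("2020-01-01T00:00:00Z/2020-12-31T23:59:59Z"))  # BETWEEN
print(which_filter("2020-01-01T00:00:00Z/.."))                    # >= start
print(which_filter("/2020-12-31T23:59:59Z"))                      # <= end
```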
10 changes: 5 additions & 5 deletions stac_fastapi/sqlalchemy/stac_fastapi/sqlalchemy/serializers.py
@@ -1,17 +1,17 @@
"""Serializers."""
import abc
import json
from datetime import datetime
from typing import TypedDict

import attr
import geoalchemy2 as ga
from stac_pydantic.shared import DATETIME_RFC339
from pystac.utils import datetime_to_str

from stac_fastapi.sqlalchemy.models import database
from stac_fastapi.types import stac as stac_types
from stac_fastapi.types.config import Settings
from stac_fastapi.types.links import CollectionLinks, ItemLinks, resolve_links
from stac_fastapi.types.rfc3339 import now_to_rfc3339_str, rfc3339_str_to_datetime


@attr.s # type:ignore
@@ -55,7 +55,7 @@ def db_to_stac(cls, db_model: database.Item, base_url: str) -> stac_types.Item:
# Use getattr to accommodate extension namespaces
field_value = getattr(db_model, field.split(":")[-1])
if field == "datetime":
field_value = field_value.strftime(DATETIME_RFC339)
field_value = datetime_to_str(field_value)
properties[field] = field_value
item_id = db_model.id
collection_id = db_model.collection_id
@@ -101,12 +101,12 @@ def stac_to_db(
# Use getattr to accommodate extension namespaces
field_value = stac_data["properties"][field]
if field == "datetime":
field_value = datetime.strptime(field_value, DATETIME_RFC339)
field_value = rfc3339_str_to_datetime(field_value)
indexed_fields[field.split(":")[-1]] = field_value

# TODO: Exclude indexed fields from the properties jsonb field to prevent duplication

now = datetime.utcnow().strftime(DATETIME_RFC339)
now = now_to_rfc3339_str()
if "created" not in stac_data["properties"]:
stac_data["properties"]["created"] = now
stac_data["properties"]["updated"] = now
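The serializer changes above swap the fixed `DATETIME_RFC339` strptime/strftime format for the shared helpers. A brief usage sketch of the round-trip, assuming the helpers accept the full RFC 3339 grammar (both imports appear in the diff; the example values are illustrative):

```python
from pystac.utils import datetime_to_str

from stac_fastapi.types.rfc3339 import now_to_rfc3339_str, rfc3339_str_to_datetime

# Parse an item's datetime string, including offset and fractional-second
# forms that a single fixed strptime format cannot cover.
dt = rfc3339_str_to_datetime("2020-02-02T12:30:22.123+01:00")

# Serialize back to an RFC 3339 string for the STAC JSON response.
print(datetime_to_str(dt))

# Timestamps for "created"/"updated" come from the shared helper as well.
print(now_to_rfc3339_str())
```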