Feature/docs fix #508

Merged: 25 commits, Jan 16, 2024
20 changes: 20 additions & 0 deletions .github/workflows/docs.yml
@@ -26,8 +26,28 @@ jobs:
        with:
          documentation_path: docs/source
          requirements_path: docs/docs-requirements.txt
      - name: checkout v0.3.x archive
        # Please do not change any step here, even though it may look hacky.
        # This is the only way to emulate `git archive --remote` with actions/checkout:
        #   - `git checkout gh-pages-v0.3.x` creates a local branch for archiving
        #   - `git pull` is optional, but it is good practice to fetch the latest version
        #   - `git checkout gh-pages` returns to the working branch
        #   - `mkdir ./v0.3.x` creates a directory for the archive
        #   - `git archive gh-pages-v0.3.x | tar -x -C ./v0.3.x` extracts the archive
        #     in the right place
        #   - `git add --all` stages the new files on the working branch
        #   - `git commit -am "Adding v0.3.x docs"` commits the changes
        run: |
          git checkout gh-pages-v0.3.x
          git pull
          git checkout gh-pages
          mkdir ./v0.3.x
          git archive gh-pages-v0.3.x | tar -x -C ./v0.3.x
          git add --all
          git commit -am "Adding v0.3.x docs"
      - name: Push changes
        uses: ad-m/github-push-action@master
        with:
          github_token: ${{ secrets.GITHUB_TOKEN }}
          branch: gh-pages

11 changes: 5 additions & 6 deletions docs/docs-requirements.txt
@@ -1,10 +1,9 @@
setuptools==68.1.2
Sphinx==4.4.0
sphinx-material==0.0.35
nbsphinx==0.8.8
Sphinx==6.1.3
sphinx-material==0.0.36
nbsphinx>=0.8.8
ipython>=8.10.1
sphinxcontrib-fulltoc==1.2.0
livereload==2.6.3
autodocsumm==0.2.7
sphinx-tabs==3.2.0
renku-sphinx-theme==0.2.3
sphinx-tabs==3.4.4
renku-sphinx-theme==0.3.0
6 changes: 5 additions & 1 deletion docs/source/api/raster-format-readers.rst
@@ -4,8 +4,9 @@ Raster Format Readers


Intro
################
#####
Mosaic provides Spark readers for the following raster formats:

* GTiff (GeoTiff) using .tif file extension - https://gdal.org/drivers/raster/gtiff.html
* COG (Cloud Optimized GeoTiff) using .tif file extension - https://gdal.org/drivers/raster/cog.html
* HDF4 using .hdf file extension - https://gdal.org/drivers/raster/hdf4.html
@@ -20,6 +21,7 @@ Mosaic provides spark readers for the following raster formats:
* XPM using .xpm file extension - https://gdal.org/drivers/raster/xpm.html
* GRIB using .grb file extension - https://gdal.org/drivers/raster/grib.html
* Zarr using .zarr file extension - https://gdal.org/drivers/raster/zarr.html

Other formats are supported if a corresponding GDAL driver is available.

Mosaic provides two flavors of the readers:
@@ -32,6 +34,7 @@ spark.read.format("gdal")
A base Spark SQL data source for reading GDAL raster data sources.
It reads metadata of the raster and exposes the direct paths for the raster files.
The output of the reader is a DataFrame with the following columns:

* tile - loaded raster tile (RasterTileType)
* ySize - height of the raster in pixels (IntegerType)
* xSize - width of the raster in pixels (IntegerType)
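
For example, a minimal read might look like the following sketch (the input path is a hypothetical placeholder, and this assumes a cluster with Mosaic's GDAL support enabled):

.. code-block:: python

    # Base GDAL reader: reads raster metadata and exposes tiles.
    df = spark.read.format("gdal").load("dbfs:/path/to/rasters/")  # hypothetical path
    df.select("tile", "xSize", "ySize").show()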
@@ -94,6 +97,7 @@ If the raster pixels are larger than the grid cells, the cell values can be calc
The interpolation method used is Inverse Distance Weighting (IDW), where the distance function is the k_ring
distance of the grid.
The reader supports the following options:

* fileExtension - file extension of the raster file (StringType) - default is *.*
* vsizip - if the rasters are zipped files, set this to true (BooleanType)
* resolution - resolution of the output grid (IntegerType)
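
As a hedged sketch, the options above might be combined as follows (option values and the input path are illustrative, and this assumes Mosaic has been enabled on the cluster):

.. code-block:: python

    import mosaic as mos

    mos.enable_mosaic(spark, dbutils)

    # Tessellate rasters to the configured index grid at resolution 8;
    # IDW interpolation fills cells where raster pixels span multiple cells.
    df = (
        mos.read().format("raster_to_grid")
        .option("fileExtension", "*.tif")  # illustrative filter
        .option("resolution", "8")
        .load("dbfs:/path/to/rasters/")    # hypothetical path
    )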
52 changes: 23 additions & 29 deletions docs/source/api/vector-format-readers.rst
@@ -8,36 +8,26 @@ Intro
Mosaic provides Spark readers for vector files supported by GDAL OGR drivers.
Only the drivers that are built by default are supported.
Here are some common useful file formats:
* GeoJSON (also ESRIJSON, TopoJSON)
https://gdal.org/drivers/vector/geojson.html
* ESRI File Geodatabase (FileGDB) and ESRI File Geodatabase vector (OpenFileGDB)
Mosaic implements named reader geo_db (described in this doc)
https://gdal.org/drivers/vector/filegdb.html
* ESRI Shapefile / DBF (ESRI Shapefile) - Mosaic implements named reader shapefile (described in this doc)
https://gdal.org/drivers/vector/shapefile.html
* Network Common Data Form (netCDF) - Mosaic implements raster reader also
https://gdal.org/drivers/raster/netcdf.html
* (Geo)Parquet (Parquet) - Mosaic will be implementing a custom reader soon
https://gdal.org/drivers/vector/parquet.html
* Spreadsheets (XLSX, XLS, ODS)
https://gdal.org/drivers/vector/xls.html
* U.S. Census TIGER/Line (TIGER)
https://gdal.org/drivers/vector/tiger.html
* PostgreSQL Dump (PGDump)
https://gdal.org/drivers/vector/pgdump.html
* Keyhole Markup Language (KML)
https://gdal.org/drivers/vector/kml.html
* Geography Markup Language (GML)
https://gdal.org/drivers/vector/gml.html
* GRASS - option for Linear Referencing Systems (LRS)
https://gdal.org/drivers/vector/grass.html

* GeoJSON (also ESRIJSON, TopoJSON) https://gdal.org/drivers/vector/geojson.html
* ESRI File Geodatabase (FileGDB) and ESRI File Geodatabase vector (OpenFileGDB). Mosaic implements the named reader geo_db (described in this doc). https://gdal.org/drivers/vector/filegdb.html
* ESRI Shapefile / DBF (ESRI Shapefile) - Mosaic implements the named reader shapefile (described in this doc) https://gdal.org/drivers/vector/shapefile.html
* Network Common Data Form (netCDF) - Mosaic also implements a raster reader https://gdal.org/drivers/raster/netcdf.html
* (Geo)Parquet (Parquet) - Mosaic will be implementing a custom reader soon https://gdal.org/drivers/vector/parquet.html
* Spreadsheets (XLSX, XLS, ODS) https://gdal.org/drivers/vector/xls.html
* U.S. Census TIGER/Line (TIGER) https://gdal.org/drivers/vector/tiger.html
* PostgreSQL Dump (PGDump) https://gdal.org/drivers/vector/pgdump.html
* Keyhole Markup Language (KML) https://gdal.org/drivers/vector/kml.html
* Geography Markup Language (GML) https://gdal.org/drivers/vector/gml.html
* GRASS - option for Linear Referencing Systems (LRS) https://gdal.org/drivers/vector/grass.html

For more information, please refer to the GDAL documentation: https://gdal.org/drivers/vector/index.html



Mosaic provides two flavors of the readers:
* spark.read.format("ogr") for reading 1 file per spark task
* mos.read().format("multi_read_ogr") for reading file in parallel with multiple spark tasks
* spark.read.format("ogr") for reading 1 file per spark task
* mos.read().format("multi_read_ogr") for reading file in parallel with multiple spark tasks


spark.read.format("ogr")
@@ -46,12 +36,13 @@ A base Spark SQL data source for reading GDAL vector data sources.
The output of the reader is a DataFrame with inferred schema.
The schema is inferred from both features and fields in the vector file.
Each feature will be provided as 2 columns:
* geometry - geometry of the feature (GeometryType)
* srid - spatial reference system identifier of the feature (StringType)
* geometry - geometry of the feature (GeometryType)
* srid - spatial reference system identifier of the feature (StringType)

The fields of the feature will be provided as columns in the DataFrame.
The types of the fields are coerced to the most concrete type that can hold all the values.
The reader supports the following options:

* driverName - GDAL driver name (StringType)
* vsizip - if the vector files are zipped files, set this to true (BooleanType)
* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
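
For example, a minimal sketch of the base OGR reader (the driver name and path are illustrative placeholders):

.. code-block:: python

    # One file per Spark task; schema is inferred from features and fields.
    df = (
        spark.read.format("ogr")
        .option("driverName", "GeoJSON")     # illustrative driver
        .option("asWKB", "false")
        .load("dbfs:/path/to/file.geojson")  # hypothetical path
    )
    df.printSchema()  # expect geometry, srid, plus one column per field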
@@ -109,12 +100,13 @@ Chunk size is the number of file rows that will be read per single task.
The output of the reader is a DataFrame with inferred schema.
The schema is inferred from both features and fields in the vector file.
Each feature will be provided as 2 columns:
* geometry - geometry of the feature (GeometryType)
* srid - spatial reference system identifier of the feature (StringType)
* geometry - geometry of the feature (GeometryType)
* srid - spatial reference system identifier of the feature (StringType)

The fields of the feature will be provided as columns in the DataFrame.
The types of the fields are coerced to the most concrete type that can hold all the values.
The reader supports the following options:

* driverName - GDAL driver name (StringType)
* vsizip - if the vector files are zipped files, set this to true (BooleanType)
* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
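
As a sketch, the parallel reader might be invoked as follows (the chunk size value and path are illustrative; this assumes Mosaic is enabled on the cluster):

.. code-block:: python

    import mosaic as mos

    mos.enable_mosaic(spark, dbutils)

    # Multiple tasks read chunks of the same file in parallel.
    df = (
        mos.read().format("multi_read_ogr")
        .option("driverName", "GeoJSON")      # illustrative driver
        .option("chunkSize", "500")           # rows per task, illustrative value
        .load("dbfs:/path/to/large.geojson")  # hypothetical path
    )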
@@ -171,6 +163,7 @@ Mosaic provides a reader for GeoDB files natively in Spark.
The output of the reader is a DataFrame with inferred schema.
Only 1 file per task is read. For parallel reading of large files use the multi_read_ogr reader.
The reader supports the following options:

* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
* layerName - name of the layer to read (StringType)
* layerNumber - number of the layer to read (IntegerType)
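
A hedged sketch of the geo_db reader (the layer name and path are hypothetical):

.. code-block:: python

    # Read one layer of a File Geodatabase; one file per Spark task.
    df = (
        mos.read().format("geo_db")
        .option("layerName", "roads")    # hypothetical layer
        .option("asWKB", "false")
        .load("dbfs:/path/to/data.gdb")  # hypothetical path
    )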
@@ -223,6 +216,7 @@ Mosaic provides a reader for Shapefiles natively in Spark.
The output of the reader is a DataFrame with inferred schema.
Only 1 file per task is read. For parallel reading of large files use the multi_read_ogr reader.
The reader supports the following options:

* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
* layerName - name of the layer to read (StringType)
* layerNumber - number of the layer to read (IntegerType)
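
A hedged sketch of the shapefile reader (the path is hypothetical):

.. code-block:: python

    # Read a shapefile; one file per Spark task.
    df = (
        mos.read().format("shapefile")
        .option("layerNumber", "0")      # first layer, illustrative
        .load("dbfs:/path/to/file.shp")  # hypothetical path
    )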
48 changes: 48 additions & 0 deletions docs/source/index.rst
@@ -60,6 +60,54 @@ Mosaic provides:
* optimisations for performing point-in-polygon joins using an approach we co-developed with Ordnance Survey (`blog post <https://databricks.com/blog/2021/10/11/efficient-point-in-polygon-joins-via-pyspark-and-bng-geospatial-indexing.html>`_); and
* the choice of a Scala, SQL and Python API.

.. note::
For Mosaic versions < 0.4.0 please use the `0.3.x docs <https://databrickslabs.github.io/mosaic/v0.3.x/index.html>`_.


Version 0.4.0
=============

We recommend using Databricks Runtime version 13.3 LTS with Photon enabled.

.. warning::
The Mosaic 0.4.x series only supports DBR 13.x.
If running on a different DBR, it will throw an exception:

**DEPRECATION ERROR: Mosaic v0.4.x series only supports Databricks Runtime 13. You can specify `%pip install 'databricks-mosaic<0.4,>=0.3'` for DBR < 13.**

As of the 0.4.0 release, Mosaic issues the following ERROR when initialized on a cluster that is neither Photon Runtime nor Databricks Runtime ML `ADB <https://learn.microsoft.com/en-us/azure/databricks/runtime/>`_ | `AWS <https://docs.databricks.com/runtime/index.html/>`_ | `GCP <https://docs.gcp.databricks.com/runtime/index.html/>`_ :

**DEPRECATION ERROR: Please use a Databricks Photon-enabled Runtime for performance benefits or Runtime ML for spatial AI benefits; Mosaic 0.4.x series restricts executing this cluster.**

As of Mosaic 0.4.0 (subject to change in follow-on releases):
* Mosaic SQL expressions cannot yet be registered with `Unity Catalog <https://www.databricks.com/product/unity-catalog>`_ due to API changes affecting DBRs >= 13.
* `Assigned Clusters <https://docs.databricks.com/en/compute/configure.html#access-modes>`_ : Mosaic Python, R, and Scala APIs.
* `Shared Access Clusters <https://docs.databricks.com/en/compute/configure.html#access-modes>`_ : Mosaic Scala API (JVM) with Admin `allowlisting <https://docs.databricks.com/en/data-governance/unity-catalog/manage-privileges/allowlist.html>`_ ; Python bindings to Mosaic Scala APIs are blocked by Py4J Security on Shared Access Clusters.

.. note::
As of Mosaic 0.4.0 (subject to change in follow-on releases):

* `Unity Catalog <https://www.databricks.com/product/unity-catalog>`_ : Enforces process isolation which is difficult to accomplish with custom JVM libraries; as such only built-in (aka platform provided) JVM APIs can be invoked from other supported languages in Shared Access Clusters.
* `Volumes <https://docs.databricks.com/en/connect/unity-catalog/volumes.html>`_ : Along the same principle of isolation, clusters (both assigned and shared access) can read Volumes via relevant built-in readers and writers or via custom python calls which do not involve any custom JVM code.



Version 0.3.x Series
====================

We recommend using Databricks Runtime version 12.2 LTS with Photon enabled.
For Mosaic versions < 0.4.0 please use the `0.3.x docs <https://databrickslabs.github.io/mosaic/v0.3.x/index.html>`_.

.. warning::
The Mosaic 0.3.x series does not support DBR 13.x.

As of the 0.3.11 release, Mosaic issues the following WARNING when initialized on a cluster that is neither Photon Runtime nor Databricks Runtime ML `ADB <https://learn.microsoft.com/en-us/azure/databricks/runtime/>`_ | `AWS <https://docs.databricks.com/runtime/index.html/>`_ | `GCP <https://docs.gcp.databricks.com/runtime/index.html/>`_ :

**DEPRECATION WARNING: Please use a Databricks Photon-enabled Runtime for performance benefits or Runtime ML for spatial AI benefits; Mosaic will stop working on this cluster after v0.3.x.**
If you are receiving this warning in v0.3.11+, you should begin planning to move to a supported runtime. We are making this change to streamline Mosaic internals and align them with future product APIs, which are powered by Photon. As part of this change, Mosaic has standardized on JTS as its default and supported vector geometry provider.





Documentation