From df84708ed7567e0d69ca01e526cebf503585aa31 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 14:09:50 +0000
Subject: [PATCH 01/25] Update sphinx

---
 docs/docs-requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/docs-requirements.txt b/docs/docs-requirements.txt
index a448f7f7d..21fbfcae0 100644
--- a/docs/docs-requirements.txt
+++ b/docs/docs-requirements.txt
@@ -1,5 +1,5 @@
 setuptools==68.1.2
-Sphinx==4.4.0
+Sphinx==6.1.0
 sphinx-material==0.0.35
 nbsphinx==0.8.8
 ipython>=8.10.1

From 444d68ad0d2e71b7c57ba3545a36435eb2b97369 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 14:15:54 +0000
Subject: [PATCH 02/25] Update sphinx

---
 docs/docs-requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/docs-requirements.txt b/docs/docs-requirements.txt
index 21fbfcae0..81f6f010f 100644
--- a/docs/docs-requirements.txt
+++ b/docs/docs-requirements.txt
@@ -1,6 +1,6 @@
 setuptools==68.1.2
 Sphinx==6.1.0
-sphinx-material==0.0.35
+sphinx-material==0.0.36
 nbsphinx==0.8.8
 ipython>=8.10.1
 sphinxcontrib-fulltoc==1.2.0

From ca81c0564eb2647719502d8662cd91a230766b1c Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 14:20:43 +0000
Subject: [PATCH 03/25] Update sphinx

---
 docs/docs-requirements.txt | 1 -
 1 file changed, 1 deletion(-)

diff --git a/docs/docs-requirements.txt b/docs/docs-requirements.txt
index 81f6f010f..f5de2239b 100644
--- a/docs/docs-requirements.txt
+++ b/docs/docs-requirements.txt
@@ -5,6 +5,5 @@ nbsphinx==0.8.8
 ipython>=8.10.1
 sphinxcontrib-fulltoc==1.2.0
 livereload==2.6.3
-autodocsumm==0.2.7
 sphinx-tabs==3.2.0
 renku-sphinx-theme==0.2.3
\ No newline at end of file

From b4606a13226dc41081e892f5239a62d71b083202 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 14:26:02 +0000
Subject: [PATCH 04/25] Update sphinx

---
 docs/docs-requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/docs-requirements.txt b/docs/docs-requirements.txt
index f5de2239b..8be84653e 100644
--- a/docs/docs-requirements.txt
+++ b/docs/docs-requirements.txt
@@ -5,5 +5,5 @@ nbsphinx==0.8.8
 ipython>=8.10.1
 sphinxcontrib-fulltoc==1.2.0
 livereload==2.6.3
-sphinx-tabs==3.2.0
+sphinx-tabs==3.4.4
 renku-sphinx-theme==0.2.3
\ No newline at end of file

From bdafaa5a4e3bd4a5351e60dffd3656d4eef2a678 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 14:29:18 +0000
Subject: [PATCH 05/25] Update sphinx

---
 docs/docs-requirements.txt | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/docs/docs-requirements.txt b/docs/docs-requirements.txt
index 8be84653e..1fd60955e 100644
--- a/docs/docs-requirements.txt
+++ b/docs/docs-requirements.txt
@@ -6,4 +6,4 @@ ipython>=8.10.1
 sphinxcontrib-fulltoc==1.2.0
 livereload==2.6.3
 sphinx-tabs==3.4.4
-renku-sphinx-theme==0.2.3
\ No newline at end of file
+renku-sphinx-theme==0.3.0
\ No newline at end of file

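The five patches above pin the docs toolchain one package at a time. As a minimal local sanity check for such pins (the names and versions below mirror docs/docs-requirements.txt as of PATCH 05; the script itself is an illustrative sketch, not part of any patch):

    # Compare installed package versions against the requirement pins.
    from importlib.metadata import PackageNotFoundError, version

    pins = {
        "Sphinx": "6.1.0",              # bumped again to 6.1.3 in PATCH 06
        "sphinx-material": "0.0.36",
        "sphinx-tabs": "3.4.4",
        "renku-sphinx-theme": "0.3.0",
    }

    for name, expected in pins.items():
        try:
            installed = version(name)
        except PackageNotFoundError:
            print(f"{name}: not installed")
            continue
        status = "OK" if installed == expected else f"expected {expected}"
        print(f"{name} {installed}: {status}")
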
From 0ab664694e459a005ef347b6f4245f3698701d97 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 14:50:54 +0000
Subject: [PATCH 06/25] Update sphinx

---
 docs/docs-requirements.txt                |  4 +-
 docs/source/api/raster-format-readers.rst | 67 +++++++++---------
 docs/source/api/vector-format-readers.rst | 85 ++++++++++-------------
 3 files changed, 73 insertions(+), 83 deletions(-)

diff --git a/docs/docs-requirements.txt b/docs/docs-requirements.txt
index 1fd60955e..969601087 100644
--- a/docs/docs-requirements.txt
+++ b/docs/docs-requirements.txt
@@ -1,7 +1,7 @@
 setuptools==68.1.2
-Sphinx==6.1.0
+Sphinx==6.1.3
 sphinx-material==0.0.36
-nbsphinx==0.8.8
+nbsphinx>=0.8.8
 ipython>=8.10.1
 sphinxcontrib-fulltoc==1.2.0
 livereload==2.6.3
diff --git a/docs/source/api/raster-format-readers.rst b/docs/source/api/raster-format-readers.rst
index dabcc821e..d41277fd7 100644
--- a/docs/source/api/raster-format-readers.rst
+++ b/docs/source/api/raster-format-readers.rst
@@ -4,22 +4,23 @@ Raster Format Readers
 
 
 Intro
-################
+#####
 Mosaic provides spark readers for the following raster formats:
-    * GTiff (GeoTiff) using .tif file extension - https://gdal.org/drivers/raster/gtiff.html
-    * COG (Cloud Optimized GeoTiff) using .tif file extension - https://gdal.org/drivers/raster/cog.html
-    * HDF4 using .hdf file extension - https://gdal.org/drivers/raster/hdf4.html
-    * HDF5 using .h5 file extension - https://gdal.org/drivers/raster/hdf5.html
-    * NetCDF using .nc file extension - https://gdal.org/drivers/raster/netcdf.html
-    * JP2ECW using .jp2 file extension - https://gdal.org/drivers/raster/jp2ecw.html
-    * JP2KAK using .jp2 file extension - https://gdal.org/drivers/raster/jp2kak.html
-    * JP2OpenJPEG using .jp2 file extension - https://gdal.org/drivers/raster/jp2openjpeg.html
-    * PDF using .pdf file extension - https://gdal.org/drivers/raster/pdf.html
-    * PNG using .png file extension - https://gdal.org/drivers/raster/png.html
-    * VRT using .vrt file extension - https://gdal.org/drivers/raster/vrt.html
-    * XPM using .xpm file extension - https://gdal.org/drivers/raster/xpm.html
-    * GRIB using .grb file extension - https://gdal.org/drivers/raster/grib.html
-    * Zarr using .zarr file extension - https://gdal.org/drivers/raster/zarr.html
+* GTiff (GeoTiff) using .tif file extension - https://gdal.org/drivers/raster/gtiff.html
+* COG (Cloud Optimized GeoTiff) using .tif file extension - https://gdal.org/drivers/raster/cog.html
+* HDF4 using .hdf file extension - https://gdal.org/drivers/raster/hdf4.html
+* HDF5 using .h5 file extension - https://gdal.org/drivers/raster/hdf5.html
+* NetCDF using .nc file extension - https://gdal.org/drivers/raster/netcdf.html
+* JP2ECW using .jp2 file extension - https://gdal.org/drivers/raster/jp2ecw.html
+* JP2KAK using .jp2 file extension - https://gdal.org/drivers/raster/jp2kak.html
+* JP2OpenJPEG using .jp2 file extension - https://gdal.org/drivers/raster/jp2openjpeg.html
+* PDF using .pdf file extension - https://gdal.org/drivers/raster/pdf.html
+* PNG using .png file extension - https://gdal.org/drivers/raster/png.html
+* VRT using .vrt file extension - https://gdal.org/drivers/raster/vrt.html
+* XPM using .xpm file extension - https://gdal.org/drivers/raster/xpm.html
+* GRIB using .grb file extension - https://gdal.org/drivers/raster/grib.html
+* Zarr using .zarr file extension - https://gdal.org/drivers/raster/zarr.html
+
 Other formats are supported if supported by GDAL available drivers.
 
 Mosaic provides two flavors of the readers:
@@ -32,14 +33,14 @@ spark.read.format("gdal")
 A base Spark SQL data source for reading GDAL raster data sources.
 It reads metadata of the raster and exposes the direct paths for the raster files.
 The output of the reader is a DataFrame with the following columns:
-    * tile - loaded raster tile (RasterTileType)
-    * ySize - height of the raster in pixels (IntegerType)
-    * xSize - width of the raster in pixels (IntegerType)
-    * bandCount - number of bands in the raster (IntegerType)
-    * metadata - raster metadata (MapType(StringType, StringType))
-    * subdatasets - raster subdatasets (MapType(StringType, StringType))
-    * srid - raster spatial reference system identifier (IntegerType)
-    * proj4Str - raster spatial reference system proj4 string (StringType)
+* tile - loaded raster tile (RasterTileType)
+* ySize - height of the raster in pixels (IntegerType)
+* xSize - width of the raster in pixels (IntegerType)
+* bandCount - number of bands in the raster (IntegerType)
+* metadata - raster metadata (MapType(StringType, StringType))
+* subdatasets - raster subdatasets (MapType(StringType, StringType))
+* srid - raster spatial reference system identifier (IntegerType)
+* proj4Str - raster spatial reference system proj4 string (StringType)
 
 .. function:: spark.read.format("gdal").load(path)
 
@@ -94,16 +95,16 @@ If the raster pixels are larger than the grid cells, the cell values can be calc
 The interpolation method used is Inverse Distance Weighting (IDW) where the distance function is a k_ring
 distance of the grid.
 The reader supports the following options:
-    * fileExtension - file extension of the raster file (StringType) - default is *.*
-    * vsizip - if the rasters are zipped files, set this to true (BooleanType)
-    * resolution - resolution of the output grid (IntegerType)
-    * combiner - combiner operation to use when converting raster to grid (StringType) - default is mean
-    * retile - if the rasters are too large they can be re-tiled to smaller tiles (BooleanType)
-    * tileSize - size of the re-tiled tiles, tiles are always squares of tileSize x tileSize (IntegerType)
-    * readSubdatasets - if the raster has subdatasets set this to true (BooleanType)
-    * subdatasetNumber - if the raster has subdatasets, select a specific subdataset by index (IntegerType)
-    * subdatasetName - if the raster has subdatasets, select a specific subdataset by name (StringType)
-    * kRingInterpolate - if the raster pixels are larger than the grid cells, use k_ring interpolation with n = kRingInterpolate (IntegerType)
+* fileExtension - file extension of the raster file (StringType) - default is *.*
+* vsizip - if the rasters are zipped files, set this to true (BooleanType)
+* resolution - resolution of the output grid (IntegerType)
+* combiner - combiner operation to use when converting raster to grid (StringType) - default is mean
+* retile - if the rasters are too large they can be re-tiled to smaller tiles (BooleanType)
+* tileSize - size of the re-tiled tiles, tiles are always squares of tileSize x tileSize (IntegerType)
+* readSubdatasets - if the raster has subdatasets set this to true (BooleanType)
+* subdatasetNumber - if the raster has subdatasets, select a specific subdataset by index (IntegerType)
+* subdatasetName - if the raster has subdatasets, select a specific subdataset by name (StringType)
+* kRingInterpolate - if the raster pixels are larger than the grid cells, use k_ring interpolation with n = kRingInterpolate (IntegerType)
 
 .. function:: mos.read().format("raster_to_grid").load(path)
 
diff --git a/docs/source/api/vector-format-readers.rst b/docs/source/api/vector-format-readers.rst
index 8825803d5..f47c86bb5 100644
--- a/docs/source/api/vector-format-readers.rst
+++ b/docs/source/api/vector-format-readers.rst
@@ -8,36 +8,25 @@ Intro
 Mosaic provides spark readers for vector files supported by GDAL OGR drivers.
 Only the drivers that are built by default are supported.
 Here are some common useful file formats:
-    * GeoJSON (also ESRIJSON, TopoJSON)
-      https://gdal.org/drivers/vector/geojson.html
-    * ESRI File Geodatabase (FileGDB) and ESRI File Geodatabase vector (OpenFileGDB)
-      Mosaic implements named reader geo_db (described in this doc)
-      https://gdal.org/drivers/vector/filegdb.html
-    * ESRI Shapefile / DBF (ESRI Shapefile) - Mosaic implements named reader shapefile (described in this doc)
-      https://gdal.org/drivers/vector/shapefile.html
-    * Network Common Data Form (netCDF) - Mosaic implements raster reader also
-      https://gdal.org/drivers/raster/netcdf.html
-    * (Geo)Parquet (Parquet) - Mosaic will be implementing a custom reader soon
-      https://gdal.org/drivers/vector/parquet.html
-    * Spreadsheets (XLSX, XLS, ODS)
-      https://gdal.org/drivers/vector/xls.html
-    * U.S. Census TIGER/Line (TIGER)
-      https://gdal.org/drivers/vector/tiger.html
-    * PostgreSQL Dump (PGDump)
-      https://gdal.org/drivers/vector/pgdump.html
-    * Keyhole Markup Language (KML)
-      https://gdal.org/drivers/vector/kml.html
-    * Geography Markup Language (GML)
-      https://gdal.org/drivers/vector/gml.html
-    * GRASS - option for Linear Referencing Systems (LRS)
-      https://gdal.org/drivers/vector/grass.html
+* GeoJSON (also ESRIJSON, TopoJSON) https://gdal.org/drivers/vector/geojson.html
+* ESRI File Geodatabase (FileGDB) and ESRI File Geodatabase vector (OpenFileGDB). Mosaic implements named reader geo_db (described in this doc). https://gdal.org/drivers/vector/filegdb.html
+* ESRI Shapefile / DBF (ESRI Shapefile) - Mosaic implements named reader shapefile (described in this doc) https://gdal.org/drivers/vector/shapefile.html
+* Network Common Data Form (netCDF) - Mosaic implements raster reader also https://gdal.org/drivers/raster/netcdf.html
+* (Geo)Parquet (Parquet) - Mosaic will be implementing a custom reader soon https://gdal.org/drivers/vector/parquet.html
+* Spreadsheets (XLSX, XLS, ODS) https://gdal.org/drivers/vector/xls.html
+* U.S. Census TIGER/Line (TIGER) https://gdal.org/drivers/vector/tiger.html
+* PostgreSQL Dump (PGDump) https://gdal.org/drivers/vector/pgdump.html
+* Keyhole Markup Language (KML) https://gdal.org/drivers/vector/kml.html
+* Geography Markup Language (GML) https://gdal.org/drivers/vector/gml.html
+* GRASS - option for Linear Referencing Systems (LRS) https://gdal.org/drivers/vector/grass.html
+
 For more information please refer to gdal documentation: https://gdal.org/drivers/vector/index.html
 
 
 
 Mosaic provides two flavors of the readers:
-    * spark.read.format("ogr") for reading 1 file per spark task
-    * mos.read().format("multi_read_ogr") for reading file in parallel with multiple spark tasks
+* spark.read.format("ogr") for reading 1 file per spark task
+* mos.read().format("multi_read_ogr") for reading file in parallel with multiple spark tasks
 
 
 spark.read.format("ogr")
@@ -46,17 +35,17 @@ A base Spark SQL data source for reading GDAL vector data sources.
 The output of the reader is a DataFrame with inferred schema.
 The schema is inferred from both features and fields in the vector file.
 Each feature will be provided as 2 columns:
-    * geometry - geometry of the feature (GeometryType)
-    * srid - spatial reference system identifier of the feature (StringType)
+* geometry - geometry of the feature (GeometryType)
+* srid - spatial reference system identifier of the feature (StringType)
 
 The fields of the feature will be provided as columns in the DataFrame.
 The types of the fields are coerced to most concrete type that can hold all the values.
 The reader supports the following options:
-    * driverName - GDAL driver name (StringType)
-    * vsizip - if the vector files are zipped files, set this to true (BooleanType)
-    * asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
-    * layerName - name of the layer to read (StringType)
-    * layerNumber - number of the layer to read (IntegerType)
+* driverName - GDAL driver name (StringType)
+* vsizip - if the vector files are zipped files, set this to true (BooleanType)
+* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
+* layerName - name of the layer to read (StringType)
+* layerNumber - number of the layer to read (IntegerType)
 
 
 .. function:: read.format("ogr").load(path)
@@ -109,18 +98,18 @@ Chunk size is the number of file rows that will be read per single task.
 The output of the reader is a DataFrame with inferred schema.
 The schema is inferred from both features and fields in the vector file.
 Each feature will be provided as 2 columns:
-    * geometry - geometry of the feature (GeometryType)
-    * srid - spatial reference system identifier of the feature (StringType)
+* geometry - geometry of the feature (GeometryType)
+* srid - spatial reference system identifier of the feature (StringType)
 
 The fields of the feature will be provided as columns in the DataFrame.
 The types of the fields are coerced to most concrete type that can hold all the values.
 The reader supports the following options:
-    * driverName - GDAL driver name (StringType)
-    * vsizip - if the vector files are zipped files, set this to true (BooleanType)
-    * asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
-    * chunkSize - size of the chunk to read from the file per single task (IntegerType) - default is 5000
-    * layerName - name of the layer to read (StringType)
-    * layerNumber - number of the layer to read (IntegerType)
+* driverName - GDAL driver name (StringType)
+* vsizip - if the vector files are zipped files, set this to true (BooleanType)
+* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
+* chunkSize - size of the chunk to read from the file per single task (IntegerType) - default is 5000
+* layerName - name of the layer to read (StringType)
+* layerNumber - number of the layer to read (IntegerType)
 
 
 .. function:: read.format("multi_read_ogr").load(path)
@@ -171,10 +160,10 @@ Mosaic provides a reader for GeoDB files natively in Spark.
 The output of the reader is a DataFrame with inferred schema.
 Only 1 file per task is read. For parallel reading of large files use the multi_read_ogr reader.
 The reader supports the following options:
-    * asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
-    * layerName - name of the layer to read (StringType)
-    * layerNumber - number of the layer to read (IntegerType)
-    * vsizip - if the vector files are zipped files, set this to true (BooleanType)
+* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
+* layerName - name of the layer to read (StringType)
+* layerNumber - number of the layer to read (IntegerType)
+* vsizip - if the vector files are zipped files, set this to true (BooleanType)
 
 .. function:: read.format("geo_db").load(path)
 
@@ -223,10 +212,10 @@ Mosaic provides a reader for Shapefiles natively in Spark.
 The output of the reader is a DataFrame with inferred schema.
 Only 1 file per task is read. For parallel reading of large files use the multi_read_ogr reader.
 The reader supports the following options:
-    * asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
-    * layerName - name of the layer to read (StringType)
-    * layerNumber - number of the layer to read (IntegerType)
-    * vsizip - if the vector files are zipped files, set this to true (BooleanType)
+* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
+* layerName - name of the layer to read (StringType)
+* layerNumber - number of the layer to read (IntegerType)
+* vsizip - if the vector files are zipped files, set this to true (BooleanType)
 
 .. function:: read.format("shapefile").load(path)
 

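The raster reader documentation reflowed above doubles as a usage recipe. A minimal PySpark sketch of the two flavors, assuming a Databricks notebook context (`spark`, `dbutils`) with Mosaic installed; the input path and option values are illustrative, not taken from the patch:

    import mosaic as mos

    mos.enable_mosaic(spark, dbutils)
    mos.enable_gdal(spark)  # assumed prerequisite for the GDAL-backed readers

    # Base reader: one row per raster, with the metadata columns listed above.
    gdal_df = spark.read.format("gdal").load("/path/to/rasters/")
    gdal_df.select("tile", "xSize", "ySize", "bandCount", "srid").show()

    # Grid reader: using only options documented in the raster_to_grid section.
    grid_df = (
        mos.read()
        .format("raster_to_grid")
        .option("fileExtension", "*.tif")
        .option("resolution", "8")
        .option("combiner", "mean")
        .option("kRingInterpolate", "3")
        .load("/path/to/rasters/")
    )
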
From 540bca928bf5827f989926f26ed949f1fde9e496 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 14:59:46 +0000
Subject: [PATCH 07/25] Update sphinx

---
 docs/source/api/raster-format-readers.rst | 67 ++++++++++++-----------
 docs/source/api/vector-format-readers.rst | 65 ++++++++++++----------
 2 files changed, 70 insertions(+), 62 deletions(-)

diff --git a/docs/source/api/raster-format-readers.rst b/docs/source/api/raster-format-readers.rst
index d41277fd7..3e0c6443e 100644
--- a/docs/source/api/raster-format-readers.rst
+++ b/docs/source/api/raster-format-readers.rst
@@ -6,20 +6,21 @@ Raster Format Readers
 Intro
 #####
 Mosaic provides spark readers for the following raster formats:
-* GTiff (GeoTiff) using .tif file extension - https://gdal.org/drivers/raster/gtiff.html
-* COG (Cloud Optimized GeoTiff) using .tif file extension - https://gdal.org/drivers/raster/cog.html
-* HDF4 using .hdf file extension - https://gdal.org/drivers/raster/hdf4.html
-* HDF5 using .h5 file extension - https://gdal.org/drivers/raster/hdf5.html
-* NetCDF using .nc file extension - https://gdal.org/drivers/raster/netcdf.html
-* JP2ECW using .jp2 file extension - https://gdal.org/drivers/raster/jp2ecw.html
-* JP2KAK using .jp2 file extension - https://gdal.org/drivers/raster/jp2kak.html
-* JP2OpenJPEG using .jp2 file extension - https://gdal.org/drivers/raster/jp2openjpeg.html
-* PDF using .pdf file extension - https://gdal.org/drivers/raster/pdf.html
-* PNG using .png file extension - https://gdal.org/drivers/raster/png.html
-* VRT using .vrt file extension - https://gdal.org/drivers/raster/vrt.html
-* XPM using .xpm file extension - https://gdal.org/drivers/raster/xpm.html
-* GRIB using .grb file extension - https://gdal.org/drivers/raster/grib.html
-* Zarr using .zarr file extension - https://gdal.org/drivers/raster/zarr.html
+
+    * GTiff (GeoTiff) using .tif file extension - https://gdal.org/drivers/raster/gtiff.html
+    * COG (Cloud Optimized GeoTiff) using .tif file extension - https://gdal.org/drivers/raster/cog.html
+    * HDF4 using .hdf file extension - https://gdal.org/drivers/raster/hdf4.html
+    * HDF5 using .h5 file extension - https://gdal.org/drivers/raster/hdf5.html
+    * NetCDF using .nc file extension - https://gdal.org/drivers/raster/netcdf.html
+    * JP2ECW using .jp2 file extension - https://gdal.org/drivers/raster/jp2ecw.html
+    * JP2KAK using .jp2 file extension - https://gdal.org/drivers/raster/jp2kak.html
+    * JP2OpenJPEG using .jp2 file extension - https://gdal.org/drivers/raster/jp2openjpeg.html
+    * PDF using .pdf file extension - https://gdal.org/drivers/raster/pdf.html
+    * PNG using .png file extension - https://gdal.org/drivers/raster/png.html
+    * VRT using .vrt file extension - https://gdal.org/drivers/raster/vrt.html
+    * XPM using .xpm file extension - https://gdal.org/drivers/raster/xpm.html
+    * GRIB using .grb file extension - https://gdal.org/drivers/raster/grib.html
+    * Zarr using .zarr file extension - https://gdal.org/drivers/raster/zarr.html
 
 Other formats are supported if supported by GDAL available drivers.
 
@@ -33,14 +34,15 @@ spark.read.format("gdal")
 A base Spark SQL data source for reading GDAL raster data sources.
 It reads metadata of the raster and exposes the direct paths for the raster files.
 The output of the reader is a DataFrame with the following columns:
-* tile - loaded raster tile (RasterTileType)
-* ySize - height of the raster in pixels (IntegerType)
-* xSize - width of the raster in pixels (IntegerType)
-* bandCount - number of bands in the raster (IntegerType)
-* metadata - raster metadata (MapType(StringType, StringType))
-* subdatasets - raster subdatasets (MapType(StringType, StringType))
-* srid - raster spatial reference system identifier (IntegerType)
-* proj4Str - raster spatial reference system proj4 string (StringType)
+
+    * tile - loaded raster tile (RasterTileType)
+    * ySize - height of the raster in pixels (IntegerType)
+    * xSize - width of the raster in pixels (IntegerType)
+    * bandCount - number of bands in the raster (IntegerType)
+    * metadata - raster metadata (MapType(StringType, StringType))
+    * subdatasets - raster subdatasets (MapType(StringType, StringType))
+    * srid - raster spatial reference system identifier (IntegerType)
+    * proj4Str - raster spatial reference system proj4 string (StringType)
 
 .. function:: spark.read.format("gdal").load(path)
 
@@ -95,16 +97,17 @@ If the raster pixels are larger than the grid cells, the cell values can be calc
 The interpolation method used is Inverse Distance Weighting (IDW) where the distance function is a k_ring
 distance of the grid.
 The reader supports the following options:
-* fileExtension - file extension of the raster file (StringType) - default is *.*
-* vsizip - if the rasters are zipped files, set this to true (BooleanType)
-* resolution - resolution of the output grid (IntegerType)
-* combiner - combiner operation to use when converting raster to grid (StringType) - default is mean
-* retile - if the rasters are too large they can be re-tiled to smaller tiles (BooleanType)
-* tileSize - size of the re-tiled tiles, tiles are always squares of tileSize x tileSize (IntegerType)
-* readSubdatasets - if the raster has subdatasets set this to true (BooleanType)
-* subdatasetNumber - if the raster has subdatasets, select a specific subdataset by index (IntegerType)
-* subdatasetName - if the raster has subdatasets, select a specific subdataset by name (StringType)
-* kRingInterpolate - if the raster pixels are larger than the grid cells, use k_ring interpolation with n = kRingInterpolate (IntegerType)
+
+    * fileExtension - file extension of the raster file (StringType) - default is *.*
+    * vsizip - if the rasters are zipped files, set this to true (BooleanType)
+    * resolution - resolution of the output grid (IntegerType)
+    * combiner - combiner operation to use when converting raster to grid (StringType) - default is mean
+    * retile - if the rasters are too large they can be re-tiled to smaller tiles (BooleanType)
+    * tileSize - size of the re-tiled tiles, tiles are always squares of tileSize x tileSize (IntegerType)
+    * readSubdatasets - if the raster has subdatasets set this to true (BooleanType)
+    * subdatasetNumber - if the raster has subdatasets, select a specific subdataset by index (IntegerType)
+    * subdatasetName - if the raster has subdatasets, select a specific subdataset by name (StringType)
+    * kRingInterpolate - if the raster pixels are larger than the grid cells, use k_ring interpolation with n = kRingInterpolate (IntegerType)
 
 .. function:: mos.read().format("raster_to_grid").load(path)
 
diff --git a/docs/source/api/vector-format-readers.rst b/docs/source/api/vector-format-readers.rst
index f47c86bb5..8d9b420e2 100644
--- a/docs/source/api/vector-format-readers.rst
+++ b/docs/source/api/vector-format-readers.rst
@@ -8,17 +8,18 @@ Intro
 Mosaic provides spark readers for vector files supported by GDAL OGR drivers.
 Only the drivers that are built by default are supported.
 Here are some common useful file formats:
-* GeoJSON (also ESRIJSON, TopoJSON) https://gdal.org/drivers/vector/geojson.html
-* ESRI File Geodatabase (FileGDB) and ESRI File Geodatabase vector (OpenFileGDB). Mosaic implements named reader geo_db (described in this doc). https://gdal.org/drivers/vector/filegdb.html
-* ESRI Shapefile / DBF (ESRI Shapefile) - Mosaic implements named reader shapefile (described in this doc) https://gdal.org/drivers/vector/shapefile.html
-* Network Common Data Form (netCDF) - Mosaic implements raster reader also https://gdal.org/drivers/raster/netcdf.html
-* (Geo)Parquet (Parquet) - Mosaic will be implementing a custom reader soon https://gdal.org/drivers/vector/parquet.html
-* Spreadsheets (XLSX, XLS, ODS) https://gdal.org/drivers/vector/xls.html
-* U.S. Census TIGER/Line (TIGER) https://gdal.org/drivers/vector/tiger.html
-* PostgreSQL Dump (PGDump) https://gdal.org/drivers/vector/pgdump.html
-* Keyhole Markup Language (KML) https://gdal.org/drivers/vector/kml.html
-* Geography Markup Language (GML) https://gdal.org/drivers/vector/gml.html
-* GRASS - option for Linear Referencing Systems (LRS) https://gdal.org/drivers/vector/grass.html
+
+    * GeoJSON (also ESRIJSON, TopoJSON) https://gdal.org/drivers/vector/geojson.html
+    * ESRI File Geodatabase (FileGDB) and ESRI File Geodatabase vector (OpenFileGDB). Mosaic implements named reader geo_db (described in this doc). https://gdal.org/drivers/vector/filegdb.html
+    * ESRI Shapefile / DBF (ESRI Shapefile) - Mosaic implements named reader shapefile (described in this doc) https://gdal.org/drivers/vector/shapefile.html
+    * Network Common Data Form (netCDF) - Mosaic implements raster reader also https://gdal.org/drivers/raster/netcdf.html
+    * (Geo)Parquet (Parquet) - Mosaic will be implementing a custom reader soon https://gdal.org/drivers/vector/parquet.html
+    * Spreadsheets (XLSX, XLS, ODS) https://gdal.org/drivers/vector/xls.html
+    * U.S. Census TIGER/Line (TIGER) https://gdal.org/drivers/vector/tiger.html
+    * PostgreSQL Dump (PGDump) https://gdal.org/drivers/vector/pgdump.html
+    * Keyhole Markup Language (KML) https://gdal.org/drivers/vector/kml.html
+    * Geography Markup Language (GML) https://gdal.org/drivers/vector/gml.html
+    * GRASS - option for Linear Referencing Systems (LRS) https://gdal.org/drivers/vector/grass.html
 
 For more information please refer to gdal documentation: https://gdal.org/drivers/vector/index.html
 
@@ -41,11 +42,12 @@ Each feature will be provided as 2 columns:
 The fields of the feature will be provided as columns in the DataFrame.
 The types of the fields are coerced to most concrete type that can hold all the values.
 The reader supports the following options:
-* driverName - GDAL driver name (StringType)
-* vsizip - if the vector files are zipped files, set this to true (BooleanType)
-* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
-* layerName - name of the layer to read (StringType)
-* layerNumber - number of the layer to read (IntegerType)
+
+    * driverName - GDAL driver name (StringType)
+    * vsizip - if the vector files are zipped files, set this to true (BooleanType)
+    * asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
+    * layerName - name of the layer to read (StringType)
+    * layerNumber - number of the layer to read (IntegerType)
 
 
 .. function:: read.format("ogr").load(path)
@@ -104,12 +106,13 @@ Each feature will be provided as 2 columns:
 The fields of the feature will be provided as columns in the DataFrame.
 The types of the fields are coerced to most concrete type that can hold all the values.
 The reader supports the following options:
-* driverName - GDAL driver name (StringType)
-* vsizip - if the vector files are zipped files, set this to true (BooleanType)
-* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
-* chunkSize - size of the chunk to read from the file per single task (IntegerType) - default is 5000
-* layerName - name of the layer to read (StringType)
-* layerNumber - number of the layer to read (IntegerType)
+
+    * driverName - GDAL driver name (StringType)
+    * vsizip - if the vector files are zipped files, set this to true (BooleanType)
+    * asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
+    * chunkSize - size of the chunk to read from the file per single task (IntegerType) - default is 5000
+    * layerName - name of the layer to read (StringType)
+    * layerNumber - number of the layer to read (IntegerType)
 
 
 .. function:: read.format("multi_read_ogr").load(path)
@@ -160,10 +163,11 @@ Mosaic provides a reader for GeoDB files natively in Spark.
 The output of the reader is a DataFrame with inferred schema.
 Only 1 file per task is read. For parallel reading of large files use the multi_read_ogr reader.
 The reader supports the following options:
-* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
-* layerName - name of the layer to read (StringType)
-* layerNumber - number of the layer to read (IntegerType)
-* vsizip - if the vector files are zipped files, set this to true (BooleanType)
+
+    * asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
+    * layerName - name of the layer to read (StringType)
+    * layerNumber - number of the layer to read (IntegerType)
+    * vsizip - if the vector files are zipped files, set this to true (BooleanType)
 
 .. function:: read.format("geo_db").load(path)
 
@@ -212,10 +216,11 @@ Mosaic provides a reader for Shapefiles natively in Spark.
 The output of the reader is a DataFrame with inferred schema.
 Only 1 file per task is read. For parallel reading of large files use the multi_read_ogr reader.
 The reader supports the following options:
-* asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
-* layerName - name of the layer to read (StringType)
-* layerNumber - number of the layer to read (IntegerType)
-* vsizip - if the vector files are zipped files, set this to true (BooleanType)
+
+    * asWKB - if the geometry should be returned as WKB (BooleanType) - default is false
+    * layerName - name of the layer to read (StringType)
+    * layerNumber - number of the layer to read (IntegerType)
+    * vsizip - if the vector files are zipped files, set this to true (BooleanType)
 
 .. function:: read.format("shapefile").load(path)
 

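Likewise for the vector readers re-indented above, a sketch using only the options documented in this patch; paths, driver choice, and layer name are illustrative assumptions:

    import mosaic as mos

    mos.enable_mosaic(spark, dbutils)  # assumes a Databricks notebook context

    # One file per Spark task; schema inferred from features and fields.
    ogr_df = (
        spark.read.format("ogr")
        .option("driverName", "GeoJSON")
        .option("asWKB", "false")
        .load("/path/to/vectors/")
    )

    # Parallel flavor: chunkSize rows per task (default 5000 per the docs).
    multi_df = (
        mos.read()
        .format("multi_read_ogr")
        .option("driverName", "GeoJSON")
        .option("chunkSize", "1000")
        .load("/path/to/vectors/data.geojson")
    )

    # The named geo_db and shapefile readers take the same asWKB / layerName /
    # layerNumber / vsizip options listed above; the layer name is hypothetical.
    gdb_df = mos.read().format("geo_db").option("layerName", "layer1").load("/path/to/geodb/")
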
From 7bcadafa849f1b3bcc9588f44643e430d371f28d Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 15:14:43 +0000
Subject: [PATCH 08/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 6 ++++++
 1 file changed, 6 insertions(+)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index a7ca1a226..a073cd8ba 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -31,3 +31,9 @@ jobs:
       with:
         github_token: ${{ secrets.GITHUB_TOKEN }}
         branch: gh-pages
+    - name: checkout v0.3.x archive
+      run: |
+        mkdir ./v0.3.x
+        cd ./v0.3.x
+        git clone -b gh-pages-v0.3.x --single-branch git@github.com:databrickslabs/mosaic.git
+

From 0fe63deb21645946ca4fc3dbf97b4850cf5a7806 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 15:20:23 +0000
Subject: [PATCH 09/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index a073cd8ba..ffb786b84 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -35,5 +35,5 @@ jobs:
       run: |
         mkdir ./v0.3.x
         cd ./v0.3.x
-        git clone -b gh-pages-v0.3.x --single-branch git@github.com:databrickslabs/mosaic.git
+        git clone -b gh-pages-v0.3.x --single-branch https://github.com/databrickslabs/mosaic.git
 

From 7baf6299c9f75493f786e6faea7d24ec106e98f0 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 15:30:07 +0000
Subject: [PATCH 10/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 11 ++++++-----
 1 file changed, 6 insertions(+), 5 deletions(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index ffb786b84..ef92222e4 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -26,14 +26,15 @@ jobs:
       with:
         documentation_path: docs/source
         requirements_path: docs/docs-requirements.txt
-    - name: Push changes
-      uses: ad-m/github-push-action@master
-      with:
-        github_token: ${{ secrets.GITHUB_TOKEN }}
-        branch: gh-pages
     - name: checkout v0.3.x archive
       run: |
+        PWD
         mkdir ./v0.3.x
         cd ./v0.3.x
         git clone -b gh-pages-v0.3.x --single-branch https://github.com/databrickslabs/mosaic.git
+    - name: Push changes
+      uses: ad-m/github-push-action@master
+      with:
+        github_token: ${{ secrets.GITHUB_TOKEN }}
+        branch: gh-pages
 

From e6448a290d6e9f82db640cb9f422794e0286ecd3 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 15:34:48 +0000
Subject: [PATCH 11/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index ef92222e4..129bca733 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -28,10 +28,10 @@ jobs:
         requirements_path: docs/docs-requirements.txt
     - name: checkout v0.3.x archive
       run: |
-        PWD
         mkdir ./v0.3.x
         cd ./v0.3.x
         git clone -b gh-pages-v0.3.x --single-branch https://github.com/databrickslabs/mosaic.git
+        ls -lah
     - name: Push changes
       uses: ad-m/github-push-action@master
       with:

From a892a377e68ff5dfa202b3c6131a6d43d4fd502d Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 15:44:22 +0000
Subject: [PATCH 12/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index 129bca733..e17d8f489 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -31,7 +31,10 @@ jobs:
         mkdir ./v0.3.x
         cd ./v0.3.x
         git clone -b gh-pages-v0.3.x --single-branch https://github.com/databrickslabs/mosaic.git
-        ls -lah
+        git add --all
+        git commit -am "Adding v0.3.x docs"
+        cd ../
+        ls -la
     - name: Push changes
       uses: ad-m/github-push-action@master
       with:

From 91bf62feed0d4f120a6626af33d6f109017860a2 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 15:50:05 +0000
Subject: [PATCH 13/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 5 ++++-
 1 file changed, 4 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index e17d8f489..5eba6cc62 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -31,9 +31,12 @@ jobs:
         mkdir ./v0.3.x
         cd ./v0.3.x
         git clone -b gh-pages-v0.3.x --single-branch https://github.com/databrickslabs/mosaic.git
+        git checkout gh-pages-v0.3.x
+        git pull
+        git add --all
+        cd ./mosaic
         git add --all
         git commit -am "Adding v0.3.x docs"
-        cd ../
         ls -la
     - name: Push changes
       uses: ad-m/github-push-action@master

From bee2aa9adcc469c74a2451af27c491e8443da47d Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 15:55:21 +0000
Subject: [PATCH 14/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index 5eba6cc62..67f7264d6 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -30,9 +30,8 @@ jobs:
       run: |
         mkdir ./v0.3.x
         cd ./v0.3.x
-        git clone -b gh-pages-v0.3.x --single-branch https://github.com/databrickslabs/mosaic.git
+        git submodule add https://github.com/databrickslabs/mosaic.git v0.3.x/mosaic
         git checkout gh-pages-v0.3.x
-        git pull
         git add --all
         cd ./mosaic
         git add --all

From 13f99e9da14a68969bea13c9ad818c65b833b553 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 15:58:56 +0000
Subject: [PATCH 15/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 2 +-
 1 file changed, 1 insertion(+), 1 deletion(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index 67f7264d6..15d193a29 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -33,7 +33,7 @@ jobs:
         git submodule add https://github.com/databrickslabs/mosaic.git v0.3.x/mosaic
         git checkout gh-pages-v0.3.x
         git add --all
-        cd ./mosaic
+        ls -la
         git add --all
         git commit -am "Adding v0.3.x docs"
         ls -la

From 72a506aa7eb1518819356bbdede57140f0587d77 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 16:00:09 +0000
Subject: [PATCH 16/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 2 --
 1 file changed, 2 deletions(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index 15d193a29..b03bb615e 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -28,8 +28,6 @@ jobs:
         requirements_path: docs/docs-requirements.txt
     - name: checkout v0.3.x archive
       run: |
-        mkdir ./v0.3.x
-        cd ./v0.3.x
         git submodule add https://github.com/databrickslabs/mosaic.git v0.3.x/mosaic
         git checkout gh-pages-v0.3.x
         git add --all

From 95eedc03b712e109c8ef01bff779504a2caa8063 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 16:04:55 +0000
Subject: [PATCH 17/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index b03bb615e..d65022232 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -30,8 +30,7 @@ jobs:
       run: |
         git submodule add https://github.com/databrickslabs/mosaic.git v0.3.x/mosaic
         git checkout gh-pages-v0.3.x
-        git add --all
-        ls -la
+        rm -f .gitmodules
         git add --all
         git commit -am "Adding v0.3.x docs"
         ls -la

From 0dfcd958454df1d4fbfa28aa85fdac03e11c04fe Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 16:22:43 +0000
Subject: [PATCH 18/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 5 ++---
 1 file changed, 2 insertions(+), 3 deletions(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index d65022232..af1f5452c 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -28,11 +28,10 @@ jobs:
         requirements_path: docs/docs-requirements.txt
     - name: checkout v0.3.x archive
       run: |
-        git submodule add https://github.com/databrickslabs/mosaic.git v0.3.x/mosaic
-        git checkout gh-pages-v0.3.x
-        rm -f .gitmodules
+        git archive gh-pages-v0.3.x | tar -x -C ./v0.3.x
         git add --all
         git commit -am "Adding v0.3.x docs"
+        cd v0.3.x
         ls -la
     - name: Push changes
       uses: ad-m/github-push-action@master

From 84ba805157caa216245faef83c8cbbba2029974a Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 16:33:21 +0000
Subject: [PATCH 19/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 3 +++
 1 file changed, 3 insertions(+)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index af1f5452c..d1c4e9a8f 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -28,11 +28,14 @@ jobs:
         requirements_path: docs/docs-requirements.txt
     - name: checkout v0.3.x archive
       run: |
+        git checkout gh-pages-v0.3.x
+        git pull
         git archive gh-pages-v0.3.x | tar -x -C ./v0.3.x
         git add --all
         git commit -am "Adding v0.3.x docs"
         cd v0.3.x
         ls -la
+        git checkout gh-pages
     - name: Push changes
       uses: ad-m/github-push-action@master
       with:

From 778ff5c652753510f9e4e62cd5c6b38858311343 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 16:36:44 +0000
Subject: [PATCH 20/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 1 +
 1 file changed, 1 insertion(+)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index d1c4e9a8f..4aa493dba 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -30,6 +30,7 @@ jobs:
       run: |
         git checkout gh-pages-v0.3.x
         git pull
+        mkdir ./v0.3.x
         git archive gh-pages-v0.3.x | tar -x -C ./v0.3.x
         git add --all
         git commit -am "Adding v0.3.x docs"

From 57c1c982e3ffc8993bad886cd37440c640763592 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 16:41:34 +0000
Subject: [PATCH 21/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index 4aa493dba..05e617ccd 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -30,13 +30,12 @@ jobs:
       run: |
         git checkout gh-pages-v0.3.x
         git pull
+        git checkout gh-pages
         mkdir ./v0.3.x
         git archive gh-pages-v0.3.x | tar -x -C ./v0.3.x
         git add --all
         git commit -am "Adding v0.3.x docs"
-        cd v0.3.x
         ls -la
-        git checkout gh-pages
     - name: Push changes
       uses: ad-m/github-push-action@master
       with:

From fa22eea6d6e515a46ca5a295d83940da744a8d20 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 16:56:19 +0000
Subject: [PATCH 22/25] Add docs archive checkout.

---
 .github/workflows/docs.yml | 11 ++++++++++-
 1 file changed, 10 insertions(+), 1 deletion(-)

diff --git a/.github/workflows/docs.yml b/.github/workflows/docs.yml
index 05e617ccd..5519ecec0 100644
--- a/.github/workflows/docs.yml
+++ b/.github/workflows/docs.yml
@@ -27,6 +27,16 @@ jobs:
         documentation_path: docs/source
         requirements_path: docs/docs-requirements.txt
     - name: checkout v0.3.x archive
+      # Please do not change any step in here, even though it may look hacky
+      # This is the only way to emulate git archive --remote with actions/checkout
+      # git checkout gh-pages-v0.3.x is required to have a local branch for archiving
+      # git pull is optional, but it's a good practice to have the latest version
+      # git checkout gh-pages right after is required to go back to the working branch
+      # mkdir ./v0.3.x is required to create a directory for the archive
+      # git archive gh-pages-v0.3.x | tar -x -C ./v0.3.x is required to extract the archive
+      # in the right place
+      # git add --all is required to add the new files to the working branch
+      # git commit -am "Adding v0.3.x docs" is required to commit the changes
       run: |
         git checkout gh-pages-v0.3.x
         git pull
@@ -35,7 +45,6 @@ jobs:
         git archive gh-pages-v0.3.x | tar -x -C ./v0.3.x
         git add --all
         git commit -am "Adding v0.3.x docs"
-        ls -la
     - name: Push changes
       uses: ad-m/github-push-action@master
       with:

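The comments added in PATCH 22 spell out the archive trick step by step. The same sequence as a local Python sketch, assuming a clone that has both the gh-pages and gh-pages-v0.3.x branches; this mirrors the workflow's shell but is not part of it:

    # Re-create `git archive gh-pages-v0.3.x | tar -x -C ./v0.3.x` plus the
    # surrounding branch dance, with explicit subprocess calls.
    import subprocess

    def git(*args: str) -> None:
        subprocess.run(["git", *args], check=True)

    git("checkout", "gh-pages-v0.3.x")  # local branch is needed for archiving
    git("pull")                         # optional: pick up the latest docs
    git("checkout", "gh-pages")         # back to the working branch
    subprocess.run(["mkdir", "./v0.3.x"], check=True)

    # Pipe the branch snapshot into tar, extracting under ./v0.3.x.
    archive = subprocess.Popen(["git", "archive", "gh-pages-v0.3.x"],
                               stdout=subprocess.PIPE)
    subprocess.run(["tar", "-x", "-C", "./v0.3.x"],
                   stdin=archive.stdout, check=True)
    archive.stdout.close()
    archive.wait()

    git("add", "--all")
    git("commit", "-am", "Adding v0.3.x docs")
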
From 0167e51b83828c4c93a55ceff857e8a6554806dc Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 17:06:22 +0000
Subject: [PATCH 23/25] Link archived docs.

---
 docs/source/index.rst | 1 +
 1 file changed, 1 insertion(+)

diff --git a/docs/source/index.rst b/docs/source/index.rst
index ee499822e..f0896cfd3 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -75,6 +75,7 @@ Documentation
    usage/usage
    models/models
    literature/videos
+   v0.3.x/index
 
 
 Indices and tables

From dbdb24d37557bd6755f71bafd5d270ac686d72d4 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 17:13:18 +0000
Subject: [PATCH 24/25] Link archived docs.

---
 docs/source/index.rst | 3 +--
 1 file changed, 1 insertion(+), 2 deletions(-)

diff --git a/docs/source/index.rst b/docs/source/index.rst
index f0896cfd3..7ff8cc7ab 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -60,7 +60,7 @@ Mosaic provides:
    * optimisations for performing point-in-polygon joins using an approach we co-developed with Ordnance Survey (`blog post <https://databricks.com/blog/2021/10/11/efficient-point-in-polygon-joins-via-pyspark-and-bng-geospatial-indexing.html>`_); and
    * the choice of a Scala, SQL and Python API.
 
-
+For Mosaic versions < 0.4.0, please use the `0.3.x docs <https://databrickslabs.github.io/mosaic/0.3.x/index.html>`_.
 
 Documentation
 =============
@@ -75,7 +75,6 @@ Documentation
    usage/usage
    models/models
    literature/videos
-   v0.3.x/index
 
 
 Indices and tables

From 5a4b7e415b1e9ed50266474791ba3e671de34833 Mon Sep 17 00:00:00 2001
From: "milos.colic" <milos.colic@databrikcs.com>
Date: Mon, 15 Jan 2024 22:04:35 +0000
Subject: [PATCH 25/25] Update docs index page.

---
 docs/source/index.rst | 50 ++++++++++++++++++++++++++++++++++++++++++-
 1 file changed, 49 insertions(+), 1 deletion(-)

diff --git a/docs/source/index.rst b/docs/source/index.rst
index 7ff8cc7ab..172396809 100644
--- a/docs/source/index.rst
+++ b/docs/source/index.rst
@@ -60,7 +60,55 @@ Mosaic provides:
    * optimisations for performing point-in-polygon joins using an approach we co-developed with Ordnance Survey (`blog post <https://databricks.com/blog/2021/10/11/efficient-point-in-polygon-joins-via-pyspark-and-bng-geospatial-indexing.html>`_); and
    * the choice of a Scala, SQL and Python API.
 
-For Mosaic versions < 0.4.0, please use the `0.3.x docs <https://databrickslabs.github.io/mosaic/0.3.x/index.html>`_.
+.. note::
+   For Mosaic versions < 0.4.0, please use the `0.3.x docs <https://databrickslabs.github.io/mosaic/v0.3.x/index.html>`_.
+
+
+Version 0.4.0
+=============
+
+We recommend using Databricks Runtime version 13.3 LTS with Photon enabled.
+
+.. warning::
+   Mosaic 0.4.x series only supports DBR 13.x.
+   If running on a different DBR, it will throw an exception:
+
+   **DEPRECATION ERROR: Mosaic v0.4.x series only supports Databricks Runtime 13. You can specify `%pip install 'databricks-mosaic<0.4,>=0.3'` for DBR < 13.**
+
+As of the 0.4.0 release, Mosaic issues the following ERROR when initialized on a cluster that is neither Photon Runtime nor Databricks Runtime ML `ADB <https://learn.microsoft.com/en-us/azure/databricks/runtime/>`_ | `AWS <https://docs.databricks.com/runtime/index.html/>`_ | `GCP <https://docs.gcp.databricks.com/runtime/index.html/>`_ :
+
+**DEPRECATION ERROR: Please use a Databricks Photon-enabled Runtime for performance benefits or Runtime ML for spatial AI benefits; Mosaic 0.4.x series restricts executing this cluster.**
+
+As of Mosaic 0.4.0 (subject to change in follow-on releases)
+   * Mosaic SQL expressions cannot yet be registered with `Unity Catalog <https://www.databricks.com/product/unity-catalog>`_ due to API changes affecting DBRs >= 13.
+   * `Assigned Clusters <https://docs.databricks.com/en/compute/configure.html#access-modes>`_ : Mosaic Python, R, and Scala APIs.
+   * `Shared Access Clusters <https://docs.databricks.com/en/compute/configure.html#access-modes>`_ : Mosaic Scala API (JVM) with Admin `allowlisting <https://docs.databricks.com/en/data-governance/unity-catalog/manage-privileges/allowlist.html>`_ ; Python bindings to Mosaic Scala APIs are blocked by Py4J Security on Shared Access Clusters.
+
+.. note::
+   As of Mosaic 0.4.0 (subject to change in follow-on releases)
+
+   * `Unity Catalog <https://www.databricks.com/product/unity-catalog>`_ : Enforces process isolation which is difficult to accomplish with custom JVM libraries; as such only built-in (aka platform provided) JVM APIs can be invoked from other supported languages in Shared Access Clusters.
+   * `Volumes <https://docs.databricks.com/en/connect/unity-catalog/volumes.html>`_ : Along the same principle of isolation, clusters (both assigned and shared access) can read Volumes via relevant built-in readers and writers or via custom python calls which do not involve any custom JVM code.
+
+
+
+Version 0.3.x Series
+====================
+
+We recommend using Databricks Runtime version 12.2 LTS with Photon enabled.
+For Mosaic versions < 0.4.0, please use the `0.3.x docs <https://databrickslabs.github.io/mosaic/v0.3.x/index.html>`_.
+
+.. warning::
+   Mosaic 0.3.x series does not support DBR 13.x.
+
+As of the 0.3.11 release, Mosaic issues the following WARNING when initialized on a cluster that is neither Photon Runtime nor Databricks Runtime ML `ADB <https://learn.microsoft.com/en-us/azure/databricks/runtime/>`_ | `AWS <https://docs.databricks.com/runtime/index.html/>`_ | `GCP <https://docs.gcp.databricks.com/runtime/index.html/>`_ :
+
+**DEPRECATION WARNING: Please use a Databricks Photon-enabled Runtime for performance benefits or Runtime ML for spatial AI benefits; Mosaic will stop working on this cluster after v0.3.x.**
+If you are receiving this warning in v0.3.11+, you should begin planning for a supported runtime. We are making this change to streamline Mosaic internals and align them with future product APIs, which are powered by Photon. As part of this change, Mosaic has standardized on JTS as its default and supported Vector Geometry Provider.
+
+
+
+
 
 Documentation
 =============