diff --git a/.github/PULL_REQUEST_TEMPLATE/pull_request_template.md b/.github/PULL_REQUEST_TEMPLATE/pull_request_template.md
index 3983bcd0..d080e673 100644
--- a/.github/PULL_REQUEST_TEMPLATE/pull_request_template.md
+++ b/.github/PULL_REQUEST_TEMPLATE/pull_request_template.md
@@ -1,22 +1,28 @@
# PolarRoute Pull Request Template
Date:
-Version Number:
+Version Number:
## Description of change
-Fixes # (issue)
+## Fixes # (issue)
# Testing
-To ensure that the functionality of the PolarRoute codebase remains consistent throughout the development cycle a testing strategy has been developed, which can be viewed in the document `.github/PULL_REQUEST_TEMPLATE/testing_strategy.md`.
+To ensure that the functionality of the PolarRoute codebase remains consistent throughout the development cycle a testing strategy has been developed, which can be viewed in the document `test/testing_strategy.md`.
This includes a collection of test files which should be run according to which part of the codebase has been altered in a pull request. Please consult the testing strategy to determine which tests need to be run.
+- [ ] My changes have not altered any of the files listed in the testing strategy
- [ ] My changes result in all required regression tests passing without the need to update test files.
+
-> *include pytest.txt file showing all tests passing.*
+> *list which files have been altered and include a pytest.txt file for each of
+> the tests required to be run*
+>
+> The files which have been changed during this PR can be listed using the command
+    git diff --name-only 0.2.x
- [ ] My changes require one or more test files to be updated for all regression tests to pass.
diff --git a/.gitignore b/.gitignore
index 1715a4e9..a096f688 100644
--- a/.gitignore
+++ b/.gitignore
@@ -17,6 +17,7 @@ TestingNotebook.ipynb
*.output.*json
setup.cfg
pip.sh
+datastore
./datastore/*
datastore/*
configs/*
diff --git a/README.md b/README.md
index f379ffc7..b4a52ed7 100644
--- a/README.md
+++ b/README.md
@@ -8,13 +8,14 @@
# PolarRoute
-> PolarRoute is a long-distance maritime polar route planning, taking into account complex changing environmental conditions. The codebase allows the construction of optimised routes through three main stages: discrete modelling of the environmental conditions using a non-uniform mesh, the construction of mesh-optimal paths, and physics informed path smoothing. In order to account for different vehicle properties we construct a series of data driven functions that can be applied to the environmental mesh to determine the speed limitations and fuel requirements for a given vessel and mesh cell, representing these quantities graphically and geospatially.
+PolarRoute is a long-distance maritime polar route planning package, taking into account complex changing environmental conditions. The codebase allows the construction of optimised routes through three main stages: discrete modelling of the environmental conditions using a non-uniform mesh, the construction of mesh-optimal paths, and physics-informed path smoothing. In order to account for different vehicle properties we construct a series of data-driven functions that can be applied to the environmental mesh to determine the speed limitations and fuel requirements for a given vessel and mesh cell, representing these quantities graphically and geospatially.
## Installation
-The PolarRoute software requires GDAL files to be installed. The PolarRoute software can be installed on Windows by running the required wheels for GDAL and FIONA. MOre information can be found in the manual pages linked above. Once these requirements are met then the software can be installed by:
+The PolarRoute package requires GDAL to be installed. On Windows, this can be done by installing the required wheels for GDAL and Fiona. More information can be found in the manual pages linked above. Once these requirements are met, the software can be installed by:
Github:
```
+git clone https://github.com/Antarctica/PolarRoute
python setup.py install
```
@@ -23,6 +24,8 @@ python setup.py install
pip install polar-route
```
+> NOTE: The installation process may vary slightly depending on your OS. Please consult the documentation for further installation guidance.
+
## Required Data sources
Polar-route has been built to work with a variety of open-source atmospheric and oceanographic data sources.
A list of supported data sources and their associated data-loaders is given in the
@@ -45,7 +48,7 @@ rm -r docs/build/.doctrees/
Jonathan Smith, Samuel Hall, George Coombs, James Byrne, Michael Thorne, Maria Fox, Harrison Abbot, Ayat Fekry
## Collaboration
-We are currently assessing the best pratice for collaboration on the codebase, until then please contact [marfox@bas.ac.uk](marfox@bas.ac.uk) or [jonsmi@bas.ac.uk](jonsmi@bas.ac.uk) for further info.
+We are currently assessing the best practice for collaboration on the codebase; until then, please contact [polarroute@bas.ac.uk](mailto:polarroute@bas.ac.uk) for further info.
## License
diff --git a/docs/html/.buildinfo b/docs/html/.buildinfo
index 5b9e2d90..a6ba2ee4 100644
--- a/docs/html/.buildinfo
+++ b/docs/html/.buildinfo
@@ -1,4 +1,4 @@
# Sphinx build info version 1
# This file hashes the configuration used when building these files. When it is not found, a full rebuild will be done.
-config: f5c21da8605eb8472f39c71741ffa923
+config: ff9ebad1c8db7cc3ee17404f270eae40
tags: 645f666f9bcd5a90fca523b33c5a78b7
diff --git a/docs/html/.doctrees/environment.pickle b/docs/html/.doctrees/environment.pickle
index b03cc0f8..cd4aade3 100644
Binary files a/docs/html/.doctrees/environment.pickle and b/docs/html/.doctrees/environment.pickle differ
diff --git a/docs/html/.doctrees/index.doctree b/docs/html/.doctrees/index.doctree
index a7a0e472..b2a17c51 100644
Binary files a/docs/html/.doctrees/index.doctree and b/docs/html/.doctrees/index.doctree differ
diff --git a/docs/html/.doctrees/sections/Code_structure.doctree b/docs/html/.doctrees/sections/Code_structure.doctree
deleted file mode 100644
index cdedf60e..00000000
Binary files a/docs/html/.doctrees/sections/Code_structure.doctree and /dev/null differ
diff --git a/docs/html/.doctrees/sections/Configuration.doctree b/docs/html/.doctrees/sections/Configuration.doctree
deleted file mode 100644
index 828af476..00000000
Binary files a/docs/html/.doctrees/sections/Configuration.doctree and /dev/null differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/DataLoaderInterface.doctree b/docs/html/.doctrees/sections/Dataloaders/DataLoaderInterface.doctree
index 2e137560..490b221b 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/DataLoaderInterface.doctree and b/docs/html/.doctrees/sections/Dataloaders/DataLoaderInterface.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/Factory.doctree b/docs/html/.doctrees/sections/Dataloaders/Factory.doctree
index 4243bfed..000b6d4b 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/Factory.doctree and b/docs/html/.doctrees/sections/Dataloaders/Factory.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/overview.doctree b/docs/html/.doctrees/sections/Dataloaders/overview.doctree
index 6e25a337..5c5a794a 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/overview.doctree and b/docs/html/.doctrees/sections/Dataloaders/overview.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/abstractScalar.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/abstractScalar.doctree
index 74283710..63341765 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/abstractScalar.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/abstractScalar.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/AMSR.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/AMSR.doctree
index 5628e4b8..d50969a7 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/AMSR.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/AMSR.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BSOSEDepth.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BSOSEDepth.doctree
index 3be39256..75b44aa0 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BSOSEDepth.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BSOSEDepth.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BSOSESeaIce.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BSOSESeaIce.doctree
index 74c550e0..85527b55 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BSOSESeaIce.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BSOSESeaIce.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BalticSeaIce.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BalticSeaIce.doctree
index 2f385848..31fb6fc6 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BalticSeaIce.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BalticSeaIce.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BinaryGRF.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BinaryGRF.doctree
index db73317a..08d57dcd 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BinaryGRF.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/BinaryGRF.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Density.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Density.doctree
index b428b29c..b9720179 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Density.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Density.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/GEBCO.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/GEBCO.doctree
index 2f70b9f0..21da5669 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/GEBCO.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/GEBCO.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/IceNet.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/IceNet.doctree
index 1f1e1eda..e779d3d8 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/IceNet.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/IceNet.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/MODIS.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/MODIS.doctree
index d4e9fa11..2c061a10 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/MODIS.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/MODIS.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/ScalarCSV.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/ScalarCSV.doctree
index 289af8d2..0582a5a8 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/ScalarCSV.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/ScalarCSV.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/ScalarGRF.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/ScalarGRF.doctree
index db2fbcfd..6829d835 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/ScalarGRF.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/ScalarGRF.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Shape.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Shape.doctree
index 8a540b01..e6e4070a 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Shape.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Shape.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Thickness.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Thickness.doctree
index a11e43b6..b13d43de 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Thickness.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/implemented/Thickness.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/scalar/index.doctree b/docs/html/.doctrees/sections/Dataloaders/scalar/index.doctree
index 671d89e9..82479920 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/scalar/index.doctree and b/docs/html/.doctrees/sections/Dataloaders/scalar/index.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/vector/abstractVector.doctree b/docs/html/.doctrees/sections/Dataloaders/vector/abstractVector.doctree
index 686fb53e..b9d22517 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/vector/abstractVector.doctree and b/docs/html/.doctrees/sections/Dataloaders/vector/abstractVector.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/BalticCurrent.doctree b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/BalticCurrent.doctree
index d8830c77..084e4688 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/BalticCurrent.doctree and b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/BalticCurrent.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/ERA5Wind.doctree b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/ERA5Wind.doctree
index 970944a1..057fc54a 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/ERA5Wind.doctree and b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/ERA5Wind.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/NorthSeaCurrent.doctree b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/NorthSeaCurrent.doctree
index 557ff2a4..9727b507 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/NorthSeaCurrent.doctree and b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/NorthSeaCurrent.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/ORAS5Current.doctree b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/ORAS5Current.doctree
index f4334627..fab3c022 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/ORAS5Current.doctree and b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/ORAS5Current.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/SOSE.doctree b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/SOSE.doctree
index 47087181..5147a26b 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/SOSE.doctree and b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/SOSE.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/VectorCSV.doctree b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/VectorCSV.doctree
index b51f6f2f..c4133535 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/VectorCSV.doctree and b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/VectorCSV.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/VectorGRF.doctree b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/VectorGRF.doctree
index b587cd62..ac99f0f6 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/vector/implemented/VectorGRF.doctree and b/docs/html/.doctrees/sections/Dataloaders/vector/implemented/VectorGRF.doctree differ
diff --git a/docs/html/.doctrees/sections/Dataloaders/vector/index.doctree b/docs/html/.doctrees/sections/Dataloaders/vector/index.doctree
index 7ad493b5..d8988025 100644
Binary files a/docs/html/.doctrees/sections/Dataloaders/vector/index.doctree and b/docs/html/.doctrees/sections/Dataloaders/vector/index.doctree differ
diff --git a/docs/html/.doctrees/sections/Discrete_meshing.doctree b/docs/html/.doctrees/sections/Discrete_meshing.doctree
deleted file mode 100644
index 64641cb4..00000000
Binary files a/docs/html/.doctrees/sections/Discrete_meshing.doctree and /dev/null differ
diff --git a/docs/html/.doctrees/sections/Examples.doctree b/docs/html/.doctrees/sections/Examples.doctree
index e72b43a8..81271331 100644
Binary files a/docs/html/.doctrees/sections/Examples.doctree and b/docs/html/.doctrees/sections/Examples.doctree differ
diff --git a/docs/html/.doctrees/sections/Installation.doctree b/docs/html/.doctrees/sections/Installation.doctree
index 9b848a78..f6b071d9 100644
Binary files a/docs/html/.doctrees/sections/Installation.doctree and b/docs/html/.doctrees/sections/Installation.doctree differ
diff --git a/docs/html/.doctrees/sections/Outputs.doctree b/docs/html/.doctrees/sections/Outputs.doctree
index e6b5ca1e..9f643bdb 100644
Binary files a/docs/html/.doctrees/sections/Outputs.doctree and b/docs/html/.doctrees/sections/Outputs.doctree differ
diff --git a/docs/html/.doctrees/sections/Route_optimisation.doctree b/docs/html/.doctrees/sections/Route_optimisation.doctree
index 0fd2c990..23feadb1 100644
Binary files a/docs/html/.doctrees/sections/Route_optimisation.doctree and b/docs/html/.doctrees/sections/Route_optimisation.doctree differ
diff --git a/docs/html/.doctrees/sections/Vehicle_specifics.doctree b/docs/html/.doctrees/sections/Vehicle_specifics.doctree
index f67d823e..4441d148 100644
Binary files a/docs/html/.doctrees/sections/Vehicle_specifics.doctree and b/docs/html/.doctrees/sections/Vehicle_specifics.doctree differ
diff --git a/docs/html/.doctrees/sections/running.doctree b/docs/html/.doctrees/sections/running.doctree
deleted file mode 100644
index 05c3e3b0..00000000
Binary files a/docs/html/.doctrees/sections/running.doctree and /dev/null differ
diff --git a/docs/html/_sources/index.rst.txt b/docs/html/_sources/index.rst.txt
index ab8198a2..97b3ae7e 100644
--- a/docs/html/_sources/index.rst.txt
+++ b/docs/html/_sources/index.rst.txt
@@ -1,19 +1,34 @@
-Welcome RoutePlanner Manual Pages
-=====================================
+Welcome to the PolarRoute Manual Pages
+======================================
+
+PolarRoute is a tool for the optimisation of routes for maritime vehicles travelling in polar waters.
+This software package has been developed by the **British Antarctic Survey** (BAS), primarily
+for the optimisation of polar routes for the BAS research vessel RRS Sir David Attenborough,
+though it is applicable to any vessel (e.g. AUVs). The software is written in Python and is open source.
+
+For more information on the project, please visit the `PolarRoute website `_
+and follow our `GitHub repository `_.
+
+
+.. note:: The development of this codebase is ongoing and not yet complete.
+ Please contact the developers for more information.
Contents:
.. toctree::
- :maxdepth: 1
+ :maxdepth: 2
:numbered:
- ./sections/Code_structure
+ ./sections/Code_overview
./sections/Installation
- ./sections/running
- ./sections/Configuration
- ./sections/Dataloaders/overview
+ ./sections/ipython_notebooks
+ ./sections/Command_line_interface
+ ./sections/Configuration/Configuration_overview
./sections/Outputs
- ./sections/Examples
- ./sections/Discrete_meshing
+ ./sections/Dataloaders/overview
+ ./sections/Mesh_Construction/Mesh_construction_overview
./sections/Vehicle_specifics
+ ./sections/Route_calculation
./sections/Route_optimisation
+ ./sections/Examples
+
diff --git a/docs/html/_sources/sections/Configuration.rst.txt b/docs/html/_sources/sections/Configuration.rst.txt
deleted file mode 100644
index 89d14615..00000000
--- a/docs/html/_sources/sections/Configuration.rst.txt
+++ /dev/null
@@ -1,350 +0,0 @@
-.. _configuration:
-
-""""""""""""""""""""""""
-Input - Configuration
-""""""""""""""""""""""""
-
-In this section we will outline the standard structure for a configuration file used in all portions of the PolarRoute software package.
-
-Outlined below is an example configuration file for running PolarRoute. Using this as a template we will go through each of the definitions in turn, describing what each portion does with the subsections in the manual given by the main sections in the configuration file.
-
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Mesh Construction configuration file example.
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-::
-
- "config": {
- "Mesh_info": {
- "Region": {
- "latMin": -65,
- "latMax": -60,
- "longMin": -70,
- "longMax": -50,
- "startTime": "2013-03-01",
- "endTime": "2013-03-14",
- "cellWidth": 5,
- "cellHeight": 2.5
- },
- "Data_sources": [
- {
- "loader": "GEBCO",
- "params": {
- "downsample_factors": [
- 5,
- 5
- ],
- "file": "../datastore/bathymetry/GEBCO/gebco_2022_n-40.0_s-90.0_w-140.0_e0.0.nc",
- "data_name": "elevation",
- "value_fill_types": "parent",
- "aggregate_type": "MAX",
- "splitting_conditions": [
- {
- "elevation": {
- "threshold": -10,
- "upper_bound": 1,
- "lower_bound": 0
- }
- }
- ]
- }
- },
- {
- "loader": "AMSR_folder",
- "params": {
- "folder": "../datastore/sic/amsr_south/",
- "hemisphere": "south",
- "value_fill_types": "parent",
- "data_name": "SIC",
- "splitting_conditions": [
- {
- "SIC": {
- "threshold": 35,
- "upper_bound": 0.9,
- "lower_bound": 0.1
- }
- }
- ]
- }
- },
- {
- "loader": "SOSE",
- "params": {
- "file": "../datastore/currents/sose_currents/SOSE_surface_velocity_6yearMean_2005-2010.nc",
- "value_fill_types": "parent",
- "data_name": "uC,vC"
- }
- },
- {
- "loader": "thickness",
- "params": {
- "data_name": "thickness",
- "file": "",
- "value_fill_types": "parent"
- }
- },
- {
- "loader": "density",
- "params": {
- "data_name": "density",
- "file": "",
- "value_fill_types": "parent"
- }
- }
- ],
- "splitting": {
- "split_depth": 4,
- "minimum_datapoints": 5
- }
- }
- }
-
-The configuration file used for mesh construction contains information required to build the discretised environment in which the route planner
-operates. Information here dictates the region in which the mesh is constructed, the data contained within
-the mesh and how the mesh is split to a non-uniform resolution. The configuration file used to generate a mesh is stored in a section titled 'Mesh_info'
-
-The 'Mesh_info' section of the configuration file contains three primary sections:
-
-################
-Region
-################
-The region section gives detailed information for the construction of the Discrete Mesh. The main definitions are the bounding region and temporal portion of interest (`longMin`, `latMin`, `longMax`, `latMax`, `startTime`, `endTime`), but also the starting shape of the spatial grid cell boxes (`cellWidth`, `cellHeight`) is defined before splitting is applied. Further detail on each parameter is given below:
-
-::
-
- "Region": {
- "latMin": -77.5,
- "latMax": -55,
- "longMin": -120,
- "longMax": -10,
- "startTime": "2017-02-01",
- "endTime": "2017-02-14",
- "cellWidth":5,
- "cellHeight":2.5
- }
-
-where the variables are as follows:
-
-* **longMin** *(float, degrees)* : Minimum Longitude Edge Mesh
-* **longMax** *(float, degrees)* : Maximum Longitude Edge Mesh
-* **latMin** *(float, degrees)* : Minimum Latitude Edge Mesh
-* **latMax** *(float, degrees)* : Maximum Latitude Edge Mesh
-* **startTime** *(string, 'YYYY-mm-dd')* : Start Datetime of Time averaging
-* **endTime** *(string, 'YYYY-mm-dd')* : End Datetime of Time averaging
-* **cellWidth** *(float, degrees)* : Initial Cell Box Width prior to splitting
-* **cellHeight** *(float, degrees)* : Initial Cell Box Height prior to splitting
-
-.. note::
- Variables **startTime** and **endTime** also support reference to system time using
- the keyword **TODAY** *e.g.*
-
- "startTime": "TODAY" , "endTime": "TODAY + 5"
-
- "startTime": "TODAY - 3", "endTime": "TODAY"
-
-#################
-Data_sources
-#################
-
-The 'Data_sources' section of the configuration file dictates which information will be added to the
-mesh when constructed. Each item in the list of data sources represents a single data set to be added
-to the mesh.
-
-::
-
- "Data_sources": [
- {
- "loader": "GEBCO",
- "params": {
- "downsample_factors": [
- 5,
- 5
- ],
- "file": "../datastore/bathymetry/GEBCO/gebco_2022_n-40.0_s-90.0_w-140.0_e0.0.nc",
- "data_name": "elevation",
- "value_fill_types": "parent",
- "aggregate_type": "MAX",
- "splitting_conditions": [
- {
- "elevation": {
- "threshold": -10,
- "upper_bound": 1,
- "lower_bound": 0
- }
- }
- ]
- }
- },
- {
- "loader": "AMSR_folder",
- "params": {
- "folder": "../datastore/sic/amsr_south/",
- "hemisphere": "south",
- "value_fill_types": "parent",
- "data_name": "SIC",
- "splitting_conditions": [
- {
- "SIC": {
- "threshold": 35,
- "upper_bound": 0.9,
- "lower_bound": 0.1
- }
- }
- ]
- }
- },
- {
- "loader": "SOSE",
- "params": {
- "file": "../datastore/currents/sose_currents/SOSE_surface_velocity_6yearMean_2005-2010.nc",
- "value_fill_types": "parent",
- "data_name": "uC,vC"
- }
- },
- {
- "loader": "thickness",
- "params": {
- "data_name": "thickness",
- "file": "",
- "value_fill_types": "parent"
- }
- },
- {
- "loader": "density",
- "params": {
- "data_name": "density",
- "file": "",
- "value_fill_types": "parent"
- }
- }
- ]
-
-
-where the variables are as follows:
-
-
-* **loader** *(string)* : The name of the data loader to be used to add this data source to the mesh
- see the :ref:`abstractScalarDataloader doc page` for further information about the available data loaders.
-* **params** *(dict)* : A dictionary containing optional parameters which may be required by the specified data loader in 'loader'. These parameters include the following:
-
- * **splitting_conditions** *(list)* : The conditions which determine if a cellbox should be split.
- * **threshold** *(float)* : The threshold above or below which CellBoxes will be sub-divided to separate the datapoints into homogeneous cells.
- * **upperBound** *(float)* : A percentage normalised between 0 and 1. A CellBox is deemed homogeneous if greater than this percentage of data points are above the given threshold.
- * **lowerBound** *(float)* : A percentage normalised between 0 and 1. A Cellbox is deemed homogeneous if less than this percentage of data points are below the given threshold.
- * **value_fill_types** *(string)* : Determines the actions taken if a cellbox is generated with no data. The possible values are either parent (which implies assigning the value of the parent cellbox), zero or nan.
- * **aggregate_type** *(string)* : Specifies how the data within a cellbox will be aggregated. By default aggregation takes place by calculating the mean of all data points within the CellBoxes bounds. *aggregate_type* allows this default to be changed to other aggregate function (e.g. MIN, MAX, COUNT).
-
-
-.. note::
- splitting conditions are applied in the order they are specified in the configuration file.
-
-
-##############
-splitting
-##############
-
-Non-uniform mesh refinement is done by selectively sub-dividing cells. Cell sub-division is performed
-whenever a cell (of any size) is determined to be inhomogeneous with respect to a specific characteristic
-of interest such as SIC or ocean depth (this characteristic is defined as a splitting condition inside the data source's params as illustrated above). For example, considering SIC, we define a range, from a lower bound
-*lb* to an upper bound *ub*, and a threshold, *t*. Then, a cell is considered inhomogeneous if between *lb* and *ub*
-of the ice measurements in that cell are at *t%* or higher. If the proportion of ice in the cell above the
-*t%* concentration is below *lb%*, we consider the cell to be homogeneous open water: such a cell can be navigated
-through so does not require splitting based on this homogeneity condition (though may still be split based on others).
-At the other end of the range, if the proportion is greater than *ub%*, then the cell is considered
-homogeneous ice: such a cell cannot be navigated through and will not be split on this or any subsequent splitting conditions.
-If the proportion is between these bounds, then the cell is inhomogeneous and must be split so that the homogeneous sub-cells can be found.
-
-The splitting section of the Configuration file defines the splitting parameters that are *common* across all the data sources and determines how the CellBoxes that form the
-Mesh will be sub-divided based on the homogeneity of the data points contained within to form a mesh
-of non-uniform spatial resolution.
-::
-
- "splitting": {
- "split_depth":4,
- "minimum_datapoints":5
- }
-
-where the variables are as follows:
-
-* **split_depth** *(float)* : The number of times the MeshBuilder will sub-divide each initial cellbox (subject to satisfying the splitting conditions of each data source)
-* **minimum_datapoints** *(float)* : The minimum number of datapoints a cellbox must contain for each value type to be able to split
-
-
-
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Vessel Performance configuration file example.
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-The Vessel configuration file provides all the necessary information about the vessel that will execute
-the routes such that performance parameters (e.g. speed or fuel consumption) can be calculated by the
-`VesselPerformanceModeller` class. A file of this structure is also used as a command line argument for
-the 'add_vehicle' entry point.
-
-::
-
- {
- "VesselType": "SDA",
- "MaxSpeed": 26.5,
- "Unit": "km/hr",
- "Beam": 24.0,
- "HullType": "slender",
- "ForceLimit": 96634.5,
- "MaxIceConc": 80,
- "MinDepth": -10
- }
-
-Above are a typical set of configuration parameters used for a vessel where the variables are as follows:
-
-* **VesselType** *(string)* : The specific vessel class to use for performance modelling.
-* **MaxSpeed** *(float)* : The maximum speed of the vessel in open water.
-* **Unit** *(string)* : The units of measurement for the speed of the vessel (currently only "km/hr" is supported).
-* **Beam** *(float)* : The beam (width) of the ship in metres.
-* **HullType** *(string)* : The hull profile of the ship (should be one of either "slender" or "blunt").
-* **ForceLimit** *(float)* : The maximum allowed resistance force, specified in Newtons.
-* **MaxIceConc** *(float)* : The maximum Sea Ice Concentration the vessel is able to travel through given as a percentage.
-* **MinDepth** *(float)* : The minimum depth of water the vessel is able to travel through in metres. Negative values correspond to a depth below sea level.
-
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Route Planning configuration file example.
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-::
-
- {
- "Route_Info": {
- "Objective_Function": "traveltime",
- "Path_Variables": [
- "fuel",
- "traveltime"
- ],
- "WayPoints": "./WayPoints_org.csv",
- "Source_Waypoints": ["LongPathStart"],
- "End_Waypoints": [],
- "Vector Names": ["uC","vC"],
- "Zero_Currents": false,
- "Variable_Speed": true,
- "Time_Unit": "days",
- "Early_Stopping_Criterion": true,
- "Save_Dijkstra_Graphs": false,
- "Smooth Path":{
- "Max Iteration Number":1000,
- "Minimum Difference": 1e-3
- }
- }
- }
-
-above is a typical set of configuration parameters used for route planning where the variables are as follows:
-
-* **objective_function** *(string)* : Defining the objective function to minimise for the construction of the mesh based Dijkstra routes. This variable can either be defined as 'traveltime' or 'fuel' .
-* **path_variables** *(list<(string)>)* : A list of strings of the route variables to return in the output geojson.
-* **waypoints_path** *(string)* : A filepath to a CSV containing the user defined waypoints with columns including: 'Name','Lat',"Long"
-* **source_waypoints** *(list<(string)>)*: The source waypoints to define the routes from. The names in this list must be the same as names within the `waypoints_path` file. If left blank then routes will be determined from all waypoints.
-* **end_waypoints** *(list<(string)>)* : The end waypoints to define the routes to. The names in this list must be the same as names within the `waypoints_path` file. If left blank then routes will be determined to all waypoints.
-* **vector_names** *(list<(string)>)* : The definition of the horizontal and vertical components of the vector acting on the ship within each CellBox. These names must be within the 'cellboxes'.
-* **zero_currents** *(bool)* : For development use only. Removes the effect of currents acting on the ship, setting all current vectors to zero.
-* **Variable_Speed** *(bool)* : For development use only. Removes the effect of variable speed acting on the ship, ship speed set to max speed defined by 'Vessel':{'Speed':...}.
-* **time_unit** *(string)* : The time unit to output the route path information. Currently only takes 'days', but will support 'hrs' in future releases.
-* **early_stopping_criterion** *(bool)* : For development use only. Dijkstra early stopping criterion. For development use only if the full objective_function from each starting waypoint is required. Should be used in conjunction with `save_dijkstra_graphs`.
-* **save_dijkstra_graphs** *(bool)* : For development use only. Saves the full dijkstra graph representing the objective_function value across all mesh cells.
-* **Smooth Path**
- * **max_iteration_number** *(int)* : For development use only. Maximum number of iterations in the path smoothing. For most paths convergence is met 100x earlier than this value.
- * **minimum_difference** *(float)* : For development use only. Minimum difference between two path smoothing iterations before convergence is triggered
-
diff --git a/docs/html/_sources/sections/Dataloaders/DataLoaderInterface.rst.txt b/docs/html/_sources/sections/Dataloaders/DataLoaderInterface.rst.txt
index 33481f32..a45467fd 100644
--- a/docs/html/_sources/sections/Dataloaders/DataLoaderInterface.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/DataLoaderInterface.rst.txt
@@ -9,5 +9,5 @@ only get_hom_condition() and get_value() are needed realistically. Other methods
implemented in the :ref:`abstractScalar` and
:ref:`abstractVector` dataloaders.
-.. automodule:: polar_route.dataloaders.dataLoaderInterface
+.. automodule:: polar_route.dataloaders.dataloader_interface
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/Factory.rst.txt b/docs/html/_sources/sections/Dataloaders/Factory.rst.txt
index 3207bae1..51fbe999 100644
--- a/docs/html/_sources/sections/Dataloaders/Factory.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/Factory.rst.txt
@@ -22,14 +22,16 @@ a parameter the dataloader requires. The actions are:
#. Import the dataloader
#. Add an entry to the :code:`dataloader_requirements` dictionary
-#. (OPTIONAL) Add a default value to :code:`set_default_params()`
^^^^^^^
Example
^^^^^^^
In this example, a new scalar dataloader `myScalarDataloader` has been created, and
is located at :code:`polar_route/Dataloaders/Scalar/myScalarDataloader.py`.
-The only parameter required by this dataloader is a file to read data from::
+
+The only parameter this dataloader requires is a file to read data from. 'files'
+is listed as the mandatory parameter because both 'file' and 'folder' get
+translated into a list of files, stored in params under the key 'files'::
# Add new import statement for Factory to read
from polar_route.Dataloaders.Scalar.myScalarDataloader import myScalarDataloader
@@ -42,20 +44,27 @@ The only parameter required by this dataloader is a file to read data from::
...
dataloader_requirements = {
...
- # Add new dataloader
- 'myscalar': (myScalarDataloader, ['file'])
+ # Add new dataloaders
+ 'myscalar': (myScalarDataloader, ['files'])
...
...
...
To call this dataloader, add an entry in the :code:`config.json`
-file used to generate the mesh::
+file used to generate the mesh. Specify either a single file, a folder, or a
+list of individual files (only one of these keys should be given)::
{
"loader": "myscalar",
"params": {
- "file": "PATH_TO_DATA_FILE",
+ "file": "PATH_TO_DATA_FILE" # For a single file
+ "folder": "PATH_TO_FOLDER" # For a folder, must have trailing '/'
+ "files":[ # For a list of individual files
+ "PATH_TO_FILE_1",
+ "PATH_TO_FILE_2",
+ ...
+ ]
}
}
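The 'file'/'folder'/'files' translation described above can be sketched as a small standalone helper (a hypothetical illustration of the behaviour, not the actual Factory API)::

```python
from glob import glob

def normalise_file_params(params):
    # Hypothetical sketch: 'file' and 'folder' both end up as a
    # list of file paths stored under the key 'files'
    if 'file' in params:
        params['files'] = [params.pop('file')]
    elif 'folder' in params:
        folder = params.pop('folder')  # assumed to end with a trailing '/'
        params['files'] = sorted(glob(folder + '*'))
    return params

# A config entry using 'file' ends up with a one-element 'files' list
params = normalise_file_params({'file': 'PATH_TO_DATA_FILE'})
```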
diff --git a/docs/html/_sources/sections/Dataloaders/overview.rst.txt b/docs/html/_sources/sections/Dataloaders/overview.rst.txt
index cd54c8f5..2fed490c 100644
--- a/docs/html/_sources/sections/Dataloaders/overview.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/overview.rst.txt
@@ -12,9 +12,9 @@ Dataloader Overview
./Factory
./scalar/index
./vector/index
+ ./AddingDataloaders
-################
Section Overview
################
@@ -35,35 +35,17 @@ however this can be whatever the user needs, so long as they are cast into eithe
*UML Diagram detailing the dataloader subsystem*
-
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-Implementing New Dataloaders
-^^^^^^^^^^^^^^^^^^^^^^^^^^^^
-
-Each dataloader is to be implemented as a separate object for the Environmental mesh to interface with.
-The general workflow for creating a new dataloader is as follows:
-
-#. Choose an approriate dataloader type (see `Dataloader Types`_).
-#. Create a new file under :code:`polar_route/DataLoaders/{dataloader-type}` with an appropriate name.
-#. Create :code:`__init__()` and :code:`import_data()` methods. Examples of how to do this are shown on the :ref:`abstractScalar` and :ref:`abstractVector` pages.
-#. Add a new entry to the dataloader factory object, within :code:`polar_route/Dataloaders/Factory.py`. Instructions on how to do so are shown in :ref:`dataloader-factory`
-
-After performing these actions, the dataloader should be ready to go. It is useful for debugging purposes
-to create the dataloader object from within :code:`polar_route/Dataloaders/Factory.py` (e.g. within
-:code:`if __name__=='__main__':` ) and test its functionality before deploying it.
-
-^^^^^^^^^^^^^^^^
Dataloader Types
-^^^^^^^^^^^^^^^^
+================
There are two main types of dataloaders that are implemented as abstract classes: Scalar and Vector.
**Scalar dataloaders** are to be used on scalar datasets; i.e. variables with a single value
per latitude/longitude(/time) coordinate. Examples of this are bathymetry, sea ice concentration, etc...
-While the raw datasets may contain more than one variable (a common example being the existance of values and errors in the same file),
+While the raw datasets may contain more than one variable (a common example being the existence of values and errors in the same file),
these *MUST* be cut down to just coordinates, and a single variable, in order to work correctly with the :ref:`abstractScalar` dataloader.
-To read more on how to implement these, follow instructions in `Implementing New Dataloaders`_ and the :ref:`abstract scalar dataloader page`.
+To read more on how to implement these, follow instructions in :ref:`Adding Dataloaders page` and the :ref:`abstract scalar dataloader page`.
**Vector dataloaders** are to be used on vector datasets; i.e. variables with multi-dimensional values
per latitude/longitude(/time) coordinate. Examples of this are ocean currents,
@@ -71,27 +53,38 @@ wind, etc... The datasets will have multiple data variables, and should be cut d
and optionally 'time'), and the values for each dimensional component of the variable. This will generally be two dimensions,
however the :ref:`abstractVector` dataloader should be flexible to n-dimensional data.
Care should be taken when testing these dataloaders to ensure that the :code:`get_value()` method produces outputs that make sense.
-To read more on how to implement these, follow instructions in `Implementing New Dataloaders`_ and :ref:`abstract vector dataloader page`.
+To read more on how to implement these, follow instructions in :ref:`Adding Dataloaders page` and :ref:`abstract vector dataloader page`.
.. **Look-up Table Dataloaders** are to be used on datasets where boundaries define a value.
-.. Real data is always prefered to this method, however in the case where there is no data, the LUT
+.. Real data is always preferred to this method, however in the case where there is no data, the LUT
.. can provide an alternative. Examples of this include ice density, and ice thickness. For these examples,
.. weather conditions dictate their values, and these weather conditions can be localised to specific areas.
.. To read more on how to implement these, follow instructions in `Implementing New Dataloaders`_ and :ref:`abstract LUT dataloader page`.
-^^^^^^^^^^^^^^^^^^^^
+
Abstract Dataloaders
-^^^^^^^^^^^^^^^^^^^^
+====================
To look at specific abstract dataloaders, use the following links:
- :ref:`abstract-scalar-dataloader`
- :ref:`abstract-vector-dataloader`
These are the templates to be used when implementing new dataloaders into PolarRoute.
-They have been split into two seperate categories: Scalar and Vector, detailed in `Dataloader Types`_.
+They have been split into two separate categories: Scalar and Vector, detailed in `Dataloader Types`_.
The abstract classes generalise the methods used by each dataloader type to produce outputs
that the Environmental Mesh can retrieve via the :ref:`dataloader interface`.
-They are flexible in that they can store and process data as both :code:`pandas.DataFrame`'s or
-:code:`xarray.Dataset`'s.
\ No newline at end of file
+They are flexible in that they can store and process data as either :code:`xarray.Dataset`'s or
+:code:`pandas.DataFrame`'s (and by extension, :code:`dask.DataFrame`'s).
+When creating your own, :code:`dask` and :code:`xarray` should be utilised as much as possible to
+reduce memory consumption.
+
+Both abstract base classes define the :code:`__init__()` method to follow the same process:
+
+#. Read in params from config
+#. Add params from :code:`self.add_default_params()`, defined by user when creating a dataloader
+#. Downsample data if required and if loaded as :code:`xarray.Dataset`
+#. Reproject data if required
+#. Trim datapoints to initial boundary
+#. Rename data column name if defined in params
\ No newline at end of file
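The initialisation process listed above can be sketched as a minimal mock (method names mirror the docs, but the bodies are stand-ins, not the real PolarRoute implementation)::

```python
class SketchDataLoader:
    """Hypothetical mock of the abstract dataloader __init__ flow."""
    def __init__(self, bounds, params):
        params = self.add_default_params(params)       # steps 1-2: config + defaults
        for key, val in params.items():
            setattr(self, key, val)
        self.data = self.import_data(bounds)           # load raw data
        self.data = self.trim_data(bounds, self.data)  # step 5: trim to boundary

    def add_default_params(self, params):
        # Stand-in for user-defined defaults
        return {'data_name': 'dummy', **params}

    def import_data(self, bounds):
        # Stand-in data source: integers 0..9
        return list(range(10))

    def trim_data(self, bounds, data):
        lo, hi = bounds
        return [x for x in data if lo <= x <= hi]

loader = SketchDataLoader((2, 5), {})
# loader.data == [2, 3, 4, 5], loader.data_name == 'dummy'
```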
diff --git a/docs/html/_sources/sections/Dataloaders/scalar/abstractScalar.rst.txt b/docs/html/_sources/sections/Dataloaders/scalar/abstractScalar.rst.txt
index 6d633f54..c625469a 100644
--- a/docs/html/_sources/sections/Dataloaders/scalar/abstractScalar.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/scalar/abstractScalar.rst.txt
@@ -4,6 +4,6 @@
Abstract Scalar Dataloader
**************************
-.. automodule:: polar_route.dataloaders.scalar.abstractScalar
+.. automodule:: polar_route.dataloaders.scalar.abstract_scalar
:special-members: __init__
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/scalar/implemented/AMSR.rst.txt b/docs/html/_sources/sections/Dataloaders/scalar/implemented/AMSR.rst.txt
index 58619970..01642a27 100644
--- a/docs/html/_sources/sections/Dataloaders/scalar/implemented/AMSR.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/scalar/implemented/AMSR.rst.txt
@@ -2,12 +2,12 @@
AMSR Dataloader
***************
-The AMSR (Advanced Microwave Scanning Radiometer) dataset is a publically
+The AMSR (Advanced Microwave Scanning Radiometer) is a publicly
available dataset that provides Sea Ice Concentration scans of the Earth's oceans.
It is produced by researchers at the University of Bremen.
The AMSR dataloader is currently the only 'standalone' dataloader, in that it
-is defined independantly of the abstract base class. This is due to issues
+is defined independently of the abstract base class. This is due to issues
with :code:`pandas` calculating mean values differently depending on how the
data is loaded. This caused issues with the regression tests passing.
This issue will be rectified soon by updating the regression tests.
diff --git a/docs/html/_sources/sections/Dataloaders/scalar/implemented/BSOSEDepth.rst.txt b/docs/html/_sources/sections/Dataloaders/scalar/implemented/BSOSEDepth.rst.txt
index 85f18aa9..55c86b8d 100644
--- a/docs/html/_sources/sections/Dataloaders/scalar/implemented/BSOSEDepth.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/scalar/implemented/BSOSEDepth.rst.txt
@@ -2,7 +2,7 @@
BSOSE Depth Dataloader
**********************
-B-SOSE (Biogeochemical Southern Ocean State Estimate solution) provide a publically available dataset that
+B-SOSE (Biogeochemical Southern Ocean State Estimate solution) provides a publicly available dataset that
hosts (amongst other products) sea ice concentration (SIC) of the southern ocean. Their SIC product provides
a 'depth' value, which this dataloader ingests.
BSOSE is an extension of the SOSE project led by Mazloff at the Scripps Institution of Oceanography.
@@ -17,6 +17,6 @@ Data can be downloaded from `here `_
-.. automodule:: polar_route.dataloaders.scalar.balticSeaIce
+.. automodule:: polar_route.dataloaders.scalar.baltic_sea_ice
:special-members: __init__
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/scalar/implemented/BinaryGRF.rst.txt b/docs/html/_sources/sections/Dataloaders/scalar/implemented/BinaryGRF.rst.txt
index 009ec3e1..2ac5f044 100644
--- a/docs/html/_sources/sections/Dataloaders/scalar/implemented/BinaryGRF.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/scalar/implemented/BinaryGRF.rst.txt
@@ -23,7 +23,7 @@ only True/False. It is useful for generating land masks.
# distribution used to generate GRF
"min": 0, # - Minimum value of GRF
"max": 1, # - Maximum value of GRF
- "binary": True, # - Flag specifiying this GRF is a binary mask
+ "binary": True, # - Flag specifying this GRF is a binary mask
"threshold": 0.5 # - Value around which mask values are set.
# Below this, values are set to False
# Above this, values are set to True
diff --git a/docs/html/_sources/sections/Dataloaders/scalar/implemented/GEBCO.rst.txt b/docs/html/_sources/sections/Dataloaders/scalar/implemented/GEBCO.rst.txt
index 8c31519d..2449b93f 100644
--- a/docs/html/_sources/sections/Dataloaders/scalar/implemented/GEBCO.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/scalar/implemented/GEBCO.rst.txt
@@ -2,7 +2,7 @@
GEBCO Dataloader
****************
-The General Bathymetric Chart of the Oceans (GEBCO) is a publically available
+The General Bathymetric Chart of the Oceans (GEBCO) is a publicly available
bathymetric chart of the Earth's oceans. It is a common resource used by
ocean scientists, amongst others.
diff --git a/docs/html/_sources/sections/Dataloaders/scalar/implemented/IceNet.rst.txt b/docs/html/_sources/sections/Dataloaders/scalar/implemented/IceNet.rst.txt
index 03fd7cc9..9aed84b2 100644
--- a/docs/html/_sources/sections/Dataloaders/scalar/implemented/IceNet.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/scalar/implemented/IceNet.rst.txt
@@ -12,7 +12,7 @@ at the British Antarctic Survey. From the website:
next 6 months of monthly-averaged sea ice concentration maps.
Data for IceNet V1 is available from `here `_
-Data for IceNet V2 is not publically available.
+Data for IceNet V2 is not publicly available.
.. automodule:: polar_route.dataloaders.scalar.icenet
:special-members: __init__
diff --git a/docs/html/_sources/sections/Dataloaders/scalar/implemented/ScalarCSV.rst.txt b/docs/html/_sources/sections/Dataloaders/scalar/implemented/ScalarCSV.rst.txt
index aff47518..f81b42da 100644
--- a/docs/html/_sources/sections/Dataloaders/scalar/implemented/ScalarCSV.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/scalar/implemented/ScalarCSV.rst.txt
@@ -7,6 +7,6 @@ it into a data source for mesh construction. It was primarily used in testing
for loading dummy data to test performance. As such, there is no data source
for this dataloader.
-.. automodule:: polar_route.dataloaders.scalar.scalarCSV
+.. automodule:: polar_route.dataloaders.scalar.scalar_csv
:special-members: __init__
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/scalar/implemented/ScalarGRF.rst.txt b/docs/html/_sources/sections/Dataloaders/scalar/implemented/ScalarGRF.rst.txt
index 367438e0..478f1c40 100644
--- a/docs/html/_sources/sections/Dataloaders/scalar/implemented/ScalarGRF.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/scalar/implemented/ScalarGRF.rst.txt
@@ -23,7 +23,7 @@ For vector fields, see the :ref:`Vector GRF page`.
"size": 512, # - Number of datapoints per lat/long axis
"alpha": 3, # - Power of the power-law momentum
# distribution used to generate GRF
- "binary": False, # - Flag specifiying this GRF isn't a binary mask
+ "binary": False, # - Flag specifying this GRF isn't a binary mask
"threshold": [0, 1], # - Caps of min/max values to ensure normalising
# not skewed by outlier in randomised GRF
"min": -10, # - Minimum value of GRF
@@ -41,6 +41,6 @@ the min and max are
The dataloader is implemented as follows:
-.. automodule:: polar_route.dataloaders.scalar.scalarGRF
+.. automodule:: polar_route.dataloaders.scalar.scalar_grf
:special-members: __init__
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/scalar/index.rst.txt b/docs/html/_sources/sections/Dataloaders/scalar/index.rst.txt
index 69f83d47..d93966b4 100644
--- a/docs/html/_sources/sections/Dataloaders/scalar/index.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/scalar/index.rst.txt
@@ -20,7 +20,7 @@ functionality that would be needed to manipulate the data to work
with the mesh. When creating a new dataloader, the user must define
how to open the data files, and what methods are required to manipulate
the data into a standard format. More details are provided on the
-:ref:`abstractScalar doc page`
+:ref:`abstractScalar doc page`.
^^^^^^^^^^^^^^^^^^^^^^^^^^
Scalar Dataloader Examples
@@ -33,80 +33,46 @@ Below is a simple example of how to load in a NetCDF file::
import logging
class MyDataLoader(ScalarDataLoader):
- def __init__(self, bounds, params):
- # Creates a class attribute for all keys in params
- for key, val in params.items():
- logging.debug(
- f"Reading in {key}:{value} (dtype={type(value)}) from params"
- )
- setattr(self, key, val)
-
- # Import data from file
- self.data = self.import_data(bounds)
-
- # Retrieve data name from variable name in NetCDF
- self.data_name = self.get_data_col_name()
-
- logging.info(f"Successfully loaded {self.data_name} from {self.file}")
-
+
def import_data(self, bounds):
logging.debug("Importing my data...")
# Open Dataset
- logging.debug(f"- Opening file {self.file}")
- data = xr.open_dataset(self.file)
+ if len(self.files) == 1: data = xr.open_dataset(self.files[0])
+ else: data = xr.open_mfdataset(self.files)
# Rename coordinate columns to 'lat', 'long', 'time' if they aren't already
data = data.rename({'lon':'long'})
# Limit to initial boundary
- data = data.sel(lat=slice(bounds.get_lat_min(),bounds.get_lat_max()))
- data = data.sel(long=slice(bounds.get_long_min(),bounds.get_long_max()))
- data = data.sel(time=slice(bounds.get_time_min(),bounds.get_time_max()))
+ data = self.trim_data(bounds, data=data)
return data
-Sometimes the data needs to be reprojected if it is not initially in mercator
-projection. It may also need to be downsampled if the dataset is very large.
-The following code handles both of these cases::
+Sometimes a parameter is constant for a particular data source, but varies
+between data sources. Default values for these are defined in the dataloader's :code:`add_default_params()` method.
+Below is an example of setting default parameters for reprojection of a dataset::
class MyDataLoader(ScalarDataLoader):
- def __init__(self, bounds, params):
- # Creates a class attribute for all keys in params
- for key, val in params.items():
- logging.debug(
- f"Reading in {key}:{value} (dtype={type(value)}) from config params"
- )
- setattr(self, key, val)
-
- # Import data from file
- self.data = self.import_data(bounds)
- # Downsampling data by 'downsample_factors' defined in config params
- self.data = self.downsample()
- # Reprojecting dataset from EPSG:3412 to 'EPSG:4326'.
- # Coordinate names 'x', 'y' will be replaced with 'long', 'lat'
- self.data = self.reproject( in_proj = 'EPSG:3412',
- out_proj = 'EPSG:4326',
- x_col = 'x',
- y_col = 'y')
-
- # Limit to initial boundary.
- # Note: Reprojection converts data to pandas dataframe
- idx = self.get_datapoints(bounds).index
- self.data = self.data.loc[idx]
-
- # Manually overwriting data name
- self.data_name = "my_variable"
- self.data = self.set_data_col_name(self.data_name)
+ def add_default_params(self, params):
+ # Add all the regular default params that scalar dataloaders have
+ params = super().add_default_params(params) # This line MUST be included
+
+ # Define projection of dataset being imported
+ params['in_proj'] = 'EPSG:3412'
+ # Define projection required by output
+ params['out_proj'] = 'EPSG:4326' # default is EPSG:4326, so strictly
+ # speaking this line is not necessary
+
+ # Coordinates in dataset that will be reprojected into long/lat
+ params['x_col'] = 'x' # Becomes 'long'
+ params['y_col'] = 'y' # Becomes 'lat'
- logging.info(f"Successfully loaded {self.data_name} from {self.file}")
+ return params
def import_data(self, bounds):
- logging.debug("Importing my data...")
-
# Open Dataset
- logging.debug(f"- Opening file {self.file}")
- data = xr.open_dataset(self.file)
+ data = xr.open_mfdataset(self.files)
# Can't easily determine bounds of data in wrong projection, so skipping for now
return data
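The precedence between user config values and dataloader defaults in :code:`add_default_params()` can be illustrated with a standalone sketch (here :code:`Base` stands in for the abstract scalar dataloader, and the default keys shown are assumptions for illustration, not the real defaults)::

```python
class Base:
    # Stand-in for the abstract dataloader's regular defaults
    def add_default_params(self, params):
        defaults = {'aggregate_type': 'MEAN'}
        return {**defaults, **params}  # values from the user config win

class MyDataLoader(Base):
    def add_default_params(self, params):
        params = super().add_default_params(params)  # this line MUST be included
        params.setdefault('in_proj', 'EPSG:3412')    # dataloader-specific default
        return params

params = MyDataLoader().add_default_params({'aggregate_type': 'MAX'})
# params == {'aggregate_type': 'MAX', 'in_proj': 'EPSG:3412'}
```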
diff --git a/docs/html/_sources/sections/Dataloaders/vector/abstractVector.rst.txt b/docs/html/_sources/sections/Dataloaders/vector/abstractVector.rst.txt
index 20301c6a..2b857973 100644
--- a/docs/html/_sources/sections/Dataloaders/vector/abstractVector.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/vector/abstractVector.rst.txt
@@ -4,6 +4,6 @@
Abstract Vector Dataloader
**************************
-.. automodule:: polar_route.dataloaders.vector.abstractVector
+.. automodule:: polar_route.dataloaders.vector.abstract_vector
:special-members: __init__
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/vector/implemented/BalticCurrent.rst.txt b/docs/html/_sources/sections/Dataloaders/vector/implemented/BalticCurrent.rst.txt
index fbaceece..678b503b 100644
--- a/docs/html/_sources/sections/Dataloaders/vector/implemented/BalticCurrent.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/vector/implemented/BalticCurrent.rst.txt
@@ -15,6 +15,6 @@ From their webpage:
Data can be downloaded from `here `_
-.. automodule:: polar_route.dataloaders.vector.balticCurrent
+.. automodule:: polar_route.dataloaders.vector.baltic_current
:special-members: __init__
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/vector/implemented/ERA5Wind.rst.txt b/docs/html/_sources/sections/Dataloaders/vector/implemented/ERA5Wind.rst.txt
index 6e1c7b3a..437d45c6 100644
--- a/docs/html/_sources/sections/Dataloaders/vector/implemented/ERA5Wind.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/vector/implemented/ERA5Wind.rst.txt
@@ -17,6 +17,6 @@ Instructions for how to download their data products are
available `here `_
-.. automodule:: polar_route.dataloaders.vector.era5Wind
+.. automodule:: polar_route.dataloaders.vector.era5_wind
:special-members: __init__
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/vector/implemented/NorthSeaCurrent.rst.txt b/docs/html/_sources/sections/Dataloaders/vector/implemented/NorthSeaCurrent.rst.txt
index 79df5c5e..0a08b963 100644
--- a/docs/html/_sources/sections/Dataloaders/vector/implemented/NorthSeaCurrent.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/vector/implemented/NorthSeaCurrent.rst.txt
@@ -10,6 +10,6 @@ More information on where to download the data is
available `here `_
-.. automodule:: polar_route.dataloaders.vector.northSeaCurrent
+.. automodule:: polar_route.dataloaders.vector.north_sea_current
:special-members: __init__
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/vector/implemented/ORAS5Current.rst.txt b/docs/html/_sources/sections/Dataloaders/vector/implemented/ORAS5Current.rst.txt
index 7c99ea03..c85cd2d5 100644
--- a/docs/html/_sources/sections/Dataloaders/vector/implemented/ORAS5Current.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/vector/implemented/ORAS5Current.rst.txt
@@ -2,7 +2,7 @@
ORAS5 Currents Dataloader
*************************
-Ocean Reanalysis System 5 (ORAS5) is a publically available dataset providing
+Ocean Reanalysis System 5 (ORAS5) is a publicly available dataset providing
estimated values for many different ocean parameters, including ocean currents.
From their website:
@@ -15,6 +15,6 @@ From their website:
Data can be downloaded from `here `_
-.. automodule:: polar_route.dataloaders.vector.oras5Current
+.. automodule:: polar_route.dataloaders.vector.oras5_current
:special-members: __init__
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/vector/implemented/SOSE.rst.txt b/docs/html/_sources/sections/Dataloaders/vector/implemented/SOSE.rst.txt
index 6e171323..da065eae 100644
--- a/docs/html/_sources/sections/Dataloaders/vector/implemented/SOSE.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/vector/implemented/SOSE.rst.txt
@@ -2,7 +2,7 @@
SOSE Currents Dataloader
************************
-Southern Ocean State Estimate (SOSE) is a publically available dataset that provides (amongst other products)
+Southern Ocean State Estimate (SOSE) is a publicly available dataset that provides (amongst other products)
ocean current vectors of the southern ocean. It is a project led by Mazloff at the Scripps Institution of Oceanography.
From their website:
diff --git a/docs/html/_sources/sections/Dataloaders/vector/implemented/VectorCSV.rst.txt b/docs/html/_sources/sections/Dataloaders/vector/implemented/VectorCSV.rst.txt
index ec7d3c8f..d7276210 100644
--- a/docs/html/_sources/sections/Dataloaders/vector/implemented/VectorCSV.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/vector/implemented/VectorCSV.rst.txt
@@ -7,6 +7,6 @@ it into a data source for mesh construction. It was primarily used in testing
for loading dummy data to test performance. As such, there is no data source
for this dataloader.
-.. automodule:: polar_route.dataloaders.vector.vectorCSV
+.. automodule:: polar_route.dataloaders.vector.vector_csv
:special-members: __init__
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/vector/implemented/VectorGRF.rst.txt b/docs/html/_sources/sections/Dataloaders/vector/implemented/VectorGRF.rst.txt
index e369c8d7..65db9138 100644
--- a/docs/html/_sources/sections/Dataloaders/vector/implemented/VectorGRF.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/vector/implemented/VectorGRF.rst.txt
@@ -33,6 +33,6 @@ For scalar fields, see the :ref:`Vector GRF page`.
The dataloader is implemented as follows:
-.. automodule:: polar_route.dataloaders.vector.vectorGRF
+.. automodule:: polar_route.dataloaders.vector.vector_grf
:special-members: __init__
:members:
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Dataloaders/vector/index.rst.txt b/docs/html/_sources/sections/Dataloaders/vector/index.rst.txt
index 3a8ceec5..22f15064 100644
--- a/docs/html/_sources/sections/Dataloaders/vector/index.rst.txt
+++ b/docs/html/_sources/sections/Dataloaders/vector/index.rst.txt
@@ -28,7 +28,7 @@ Vector Dataloader Examples
Creating a vector dataloader is almost identical to creating a
:ref:`scalar dataloader`. The key differences
are that the `VectorDataLoader` abstract base class must be used, and that
-the `data_name` is a comma seperated string of the vector component names.
+the `data_name` is a comma separated string of the vector component names.
e.g. a dataloader storing a vector with column names :code:`uC` and
:code:`vC` will have an attribute :code:`self.data_name = 'uC,vC'`
Data must be imported and saved as an xarray.Dataset, or a
@@ -40,22 +40,6 @@ NetCDF file::
import logging
class MyDataLoader(VectorDataLoader):
- def __init__(self, bounds, params):
- # Creates a class attribute for all keys in params
- for key, val in params.items():
- logging.debug(
- f"Reading in {key}:{value} (dtype={type(value)}) from params"
- )
- setattr(self, key, val)
-
- # Import data from file
- self.data = self.import_data(bounds)
-
- # Retrieve data name from variable name in NetCDF
- self.data_name = self.get_data_col_name()
-
- logging.info(f"Successfully loaded {self.data_name} from {self.file}")
-
def import_data(self, bounds):
logging.debug("Importing my data...")
# Open Dataset
@@ -66,53 +50,34 @@ NetCDF file::
data = data.rename({'lon':'long'})
# Limit to initial boundary
- data = data.sel(lat=slice(bounds.get_lat_min(),bounds.get_lat_max()))
- data = data.sel(long=slice(bounds.get_long_min(),bounds.get_long_max()))
- data = data.sel(time=slice(bounds.get_time_min(),bounds.get_time_max()))
+ data = self.trim_data(bounds, data=data)
return data
-Similar to scalar data loaders, sometimes the data needs to be reprojected
-if it is not initially in mercator projection. It may also need to be downsampled
-if the dataset is very large. The following code handles both of these cases::
+Similar to scalar dataloaders, sometimes a parameter is constant for a particular
+data source, but varies between data sources. Default values may
+be defined either in the dataloader factory, or within the dataloader itself.
+Below is an example of setting default parameters for reprojection of a dataset::
    class MyDataLoader(VectorDataLoader):
- def __init__(self, bounds, params):
- # Creates a class attribute for all keys in params
- for key, val in params.items():
- logging.debug(
- f"Reading in {key}:{value} (dtype={type(value)}) from config params"
- )
- setattr(self, key, val)
-
- # Import data from file
- self.data = self.import_data(bounds)
- # Downsampling data by 'downsample_factors' defined in config params
- self.data = self.downsample()
- # Reprojecting dataset from EPSG:3412 to 'EPSG:4326'.
- # Coordinate names 'x', 'y' will be replaced with 'long', 'lat'
- self.data = self.reproject( in_proj = 'EPSG:3412',
- out_proj = 'EPSG:4326',
- x_col = 'x',
- y_col = 'y')
-
- # Limit to initial boundary.
- # Note: Reprojection converts data to pandas dataframe
- idx = self.get_datapoints(bounds).index
- self.data = self.data.loc[idx]
-
- # Manually overwriting data names for each vector component
- self.data_name = "v_x,v_y"
- self.data = self.set_data_col_name(self.data_name)
- logging.info(f"Successfully loaded {self.data_name} from {self.file}")
+ def add_default_params(self, params):
+ # Add all the regular default params that scalar dataloaders have
+ params = super().add_default_params(params) # This line MUST be included
+
+ # Define projection of dataset being imported
+ params['in_proj'] = 'EPSG:3412'
+ # Define projection required by output
+ params['out_proj'] = 'EPSG:4326' # default is EPSG:4326, so strictly
+ # speaking this line is not necessary
+
+ # Coordinates in dataset that will be reprojected into long/lat
+ params['x_col'] = 'x' # Becomes 'long'
+ params['y_col'] = 'y' # Becomes 'lat'
+
+ return params
def import_data(self, bounds):
- logging.debug("Importing my data...")
-
# Open Dataset
- logging.debug(f"- Opening file {self.file}")
- data = xr.open_dataset(self.file)
+ data = xr.open_mfdataset(self.files)
# Can't easily determine bounds of data in wrong projection, so skipping for now
return data
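The override pattern shown above can be sketched as self-contained Python. Note that `BaseLoader` below is a simplified stand-in for PolarRoute's `ScalarDataLoader`, and the shared defaults it sets are illustrative assumptions, not the library's actual values:

```python
# Sketch of the add_default_params override pattern.
# BaseLoader is a simplified stand-in for ScalarDataLoader;
# the defaults it applies here are illustrative only.

class BaseLoader:
    def add_default_params(self, params):
        # Parent fills in defaults shared by all scalar dataloaders,
        # without overwriting values the user already supplied
        params.setdefault('downsample_factors', (1, 1))
        params.setdefault('aggregate_type', 'MEAN')
        return params


class MyDataLoader(BaseLoader):
    def add_default_params(self, params):
        # Call the parent first so the shared defaults are present
        params = super().add_default_params(params)
        # Dataset-specific defaults for reprojection
        params['in_proj'] = 'EPSG:3412'
        params['out_proj'] = 'EPSG:4326'
        params['x_col'] = 'x'  # becomes 'long' after reprojection
        params['y_col'] = 'y'  # becomes 'lat' after reprojection
        return params


params = MyDataLoader().add_default_params({'downsample_factors': (2, 2)})
print(params['in_proj'])             # 'EPSG:3412'
print(params['downsample_factors'])  # (2, 2) - user value not overwritten
```

Because the parent method only fills in missing keys, any value supplied in the config takes precedence over the defaults.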
diff --git a/docs/html/_sources/sections/Discrete_meshing.rst.txt b/docs/html/_sources/sections/Discrete_meshing.rst.txt
deleted file mode 100644
index 662324d0..00000000
--- a/docs/html/_sources/sections/Discrete_meshing.rst.txt
+++ /dev/null
@@ -1,99 +0,0 @@
-********************************
-Methods - Mesh Construction
-********************************
-
-Throughout this section we will outline an overview of the Environment Mesh Construction module, describe the main classes that composes the module and illustrate a use case for the Discrete Meshing of the environment.
-
-Mesh Construction Overview
-##############################
-A general overview of the method can be seen below:
-
-.. figure:: ./Figures/FlowDiagram_MeshGraph.png
- :align: center
- :width: 700
-
- Overview figure of the Discrete Meshing from the multi-data input.
-
-
-Mesh Construction Design
-##############################
-The below UML diagram describes how the Environment Mesh Construction module is designed. It depicts the classes of the module and how they interact with each other.
-
-.. figure:: ./Figures/mesh-construct-UML.drawio.png
- :align: center
- :width: 1000
-
-
-
-
-Mesh Construction Use case
-###################################
-This sequence diagram illustrates a use case for the Discrete Meshing of the environment, where the module's client starts by initializing the MeshBuilder with a certain mesh configuration (see Input-Configuration section for more details about the configuration format) then calls build_environment_mesh method.
-
-.. figure:: ./Figures/mesh-build-sequence-diagram.drawio.png
- :align: center
- :width: 1000
-
-
-
-The following diagram depicts the sequence of events that take place inside build_environment_mesh method into details
-
-.. figure:: ./Figures/build-env-mesh.drawio.png
- :align: center
- :width: 1000
-
-
-Classes
-##############
-This section describes the main classes of the Mesh Construction module in details
-
-CellBox
-***************
-.. automodule:: polar_route.mesh_generation.cellbox
-
-.. autoclass:: polar_route.mesh_generation.cellbox.CellBox
- :special-members: __init__
- :members: set_data_source, should_split, split, set_parent, aggregate
-
-MetaData
-***********
-
-.. automodule:: polar_route.mesh_generation.metadata
-
-.. autoclass:: polar_route.mesh_generation.metadata.Metadata
- :special-members: __init__
-
-
-MeshBuilder
-************
-
-.. automodule:: polar_route.mesh_generation.mesh_builder
-
-.. autoclass:: polar_route.mesh_generation.mesh_builder.MeshBuilder
- :special-members: __init__
- :members: build_environmental_mesh , split_and_replace, split_to_depth
-
-AggregatedCellBox
-******************
-.. automodule:: polar_route.mesh_generation.aggregated_cellBox
-
-.. autoclass:: polar_route.mesh_generation.aggregated_cellBox.AggregatedCellBox
- :special-members: __init__
- :members: contains_point, to_json
-
-EnvironmentMesh
-****************
-.. automodule:: polar_route.mesh_generation.environment_mesh
-
-.. autoclass:: polar_route.mesh_generation.environment_mesh.EnvironmentMesh
- :special-members: __init__
- :members: load_from_json, update_cellbox , to_json, save
-
-
-NeighbourGraph
-***************
-
-.. automodule:: polar_route.mesh_generation.neighbour_graph
-
-.. autoclass:: polar_route.mesh_generation.neighbour_graph.NeighbourGraph
- :members: initialise_neighbour_graph, get_neighbour_case, update_neighbours
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Examples.rst.txt b/docs/html/_sources/sections/Examples.rst.txt
index 8b2c9d0a..752e6ea4 100644
--- a/docs/html/_sources/sections/Examples.rst.txt
+++ b/docs/html/_sources/sections/Examples.rst.txt
@@ -1,7 +1,7 @@
************************
Examples of running code
************************
-Throughout this section we will discuss a series of examples that cna be run by the user either from the Command-Ling or within python/ipython notebooks. All these examples and more can be found at the Google Colab `Link `_.
+Throughout this section we will discuss a series of examples that can be run by the user either from the Command-Line or within python/ipython notebooks. All these examples and more can be found at the Google Colab `Link `_.
=============================================================
Example 1 - Antarctica Processing all stages of route planner
@@ -17,6 +17,6 @@ Example 2 - Dijkstra vs smooth paths
============================================================
-Example 2 - Variations of Vehichle Properties on Route Paths
+Example 3 - Variations of Vehicle Properties on Route Paths
============================================================
...to be added shortly ...
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Installation.rst.txt b/docs/html/_sources/sections/Installation.rst.txt
index 434e2346..b1f20ca1 100644
--- a/docs/html/_sources/sections/Installation.rst.txt
+++ b/docs/html/_sources/sections/Installation.rst.txt
@@ -6,38 +6,76 @@ In this section we will outline the installation steps for installing the softwa
The first stage is installing a version of Python 3.9, if you don't have a working version. We suggest installing a working Anaconda distribution from https://www.anaconda.com/products/distribution#macos following the instructions on that page.
-Windows
-#######
-The PolarRoute software requires GDAL files to be installed. The PolarRoute software can be installed on Windows by running one of the two following commands.
+Installing PolarRoute
+#####################
+
+The PolarRoute software can be installed on Windows/Linux/MacOS by running one of the two following commands.
Github:
::
-
- pip install pipwin
- pipwin install gdal
- pipwin install fiona
+
+ git clone https://github.com/antarctica/PolarRoute.git
+ cd PolarRoute
python setup.py install
Pip:
::
- pip install pipwin
+ pip install polar-route
+
+
+Installing GDAL
+###############
+
+The PolarRoute software has GDAL as an optional requirement. It is only used when exporting TIFF images,
+so if you do not need this feature we recommend skipping GDAL entirely: its installation is not trivial
+and is a common source of problems. With that said, below are instructions for various operating systems.
+
+Windows
+*******
+
+.. note::
+ We assume a version of Windows 10 or higher, with a working version of Python 3.9 including pip installed.
+ We recommend installing PolarRoute into a virtual environment.
+
+Windows:
+
+::
+
+ pip install pipwin # pipwin is a package that allows for easy installation of Windows binaries
pipwin install gdal
pipwin install fiona
- pip install polar-route
Linux/MacOS
-###########
+***********
-The PolarRoute software can be installed on Linux/MacOS by running one of the two following commands.
+Ubuntu/Debian:
-Github:
::
+
+ sudo add-apt-repository ppa:ubuntugis/ppa
+ sudo apt-get update
+ sudo apt-get install gdal-bin libgdal-dev
+ export CPLUS_INCLUDE_PATH=/usr/include/gdal
+ export C_INCLUDE_PATH=/usr/include/gdal
+ pip install GDAL==$(gdal-config --version)
- python setup.py install
-Pip:
+Fedora:
+
::
- pip install polar-route
+ sudo dnf update
+ sudo dnf install gdal gdal-devel
+ export CPLUS_INCLUDE_PATH=/usr/include/gdal
+ export C_INCLUDE_PATH=/usr/include/gdal
+ pip install GDAL==$(gdal-config --version)
+
+
+MacOS (with HomeBrew):
+
+::
+
+ brew install gdal --HEAD
+ brew install gdal
+ pip install GDAL==$(gdal-config --version)
\ No newline at end of file
diff --git a/docs/html/_sources/sections/Outputs.rst.txt b/docs/html/_sources/sections/Outputs.rst.txt
index 75a332ae..4b3ba9f3 100644
--- a/docs/html/_sources/sections/Outputs.rst.txt
+++ b/docs/html/_sources/sections/Outputs.rst.txt
@@ -4,9 +4,9 @@
Outputs - Data Types
********************
-#################
-Mesh construction
-#################
+######################
+The Mesh.json file
+######################
The first stage in the route planning pipeline is constructing a discrete
mesh of the environment in which the route planner can operate. Once this
@@ -26,7 +26,7 @@ of mesh construction and json object generation are as follows:
.. note::
Examples and a description of the configuration files can be found in
- the :ref:`Configuration` section of this document.
+ the :ref:`configuration - mesh construction` section of this document.
The json object outputted by the Mesh consists of 3 sections: **config**,
@@ -137,12 +137,13 @@ where each of the values represent the following:
:align: center
:width: 700
-#################
-Vehicle specifics
-#################
+###########################
+The Vessel_mesh.json file
+###########################
Once a discrete mesh environment is constructed, it is then passed to the vessel performance modeller
-which applies transformations which are specific to a given vehicle.
+which applies transformations specific to a given vehicle. These vehicle-specific values
+are then encoded into the mesh json object and passed downstream to the route planner.
::
@@ -162,7 +163,8 @@ which applies transformations which are specific to a given vehicle.
.. note::
To make use of the full range of vessel performance transformations, a Mesh should be constructed with
the following attributes:
-
+
+ * elevation (available via data_loaders: *gebco*, *bsose_depth*)
* SIC (available via data_loaders: *amsr*, *bsose_sic*, *baltic_sic*, *icenet*, *modis*)
* thickness (available via data_loaders: *thickness*)
* density (available via data_loaders: *density*)
@@ -189,17 +191,20 @@ have a set of new attributes as follows:
* **relative wind angle** *(list)* : The angle of the apparent wind acting on the vessel.
-##############
-Route planning
-##############
+#########################
+The Route.json file
+#########################
-During the route planning stage of the pipline information on the routes and the waypoints used are saved as outputs to the processing stage. Descriptions of the structure of the two outputs are given below:
+During the route planning stage of the pipeline, information on the routes and the waypoints used is saved
+as an output of the processing stage. Descriptions of the structure of the two outputs are given below:
=========
waypoints
=========
-An entry in the json including all the information of the waypoints defined by the user from the `waypoints_path` file. It may be the case that ot all waypoints would have been used in the route construction, but all waypoints are returned to this entry. The structure of the entry follows:
+An entry in the json including all the information of the waypoints defined by the user from the `waypoints_path`
+file. It may be the case that not all waypoints were used in the route construction, but all waypoints
+are returned in this entry. The structure of the entry follows:
::
@@ -287,7 +292,7 @@ where the output takes a GeoJSON standard form (more infor given at https://geoj
* **** : A list of the features representing each of the separate routes constructed
* **geometry** : The positioning of the route locations
* **coordinates** : A list of the Lat,Long position of all the route points
- * **** : A list of metainformation about the route
+ * **** : A list of meta-information about the route
* **from** : Starting waypoint of route
* **to** : Ending waypoint of route
   * **traveltime** : A list of float values representing the cumulative travel time along the route. This entry was originally defined as an output in the configuration file by the `path_variables` definition.
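As a purely illustrative sketch of this structure (all values, including the waypoint names, are invented for this example and not taken from a real run), a single route feature might look like:

```json
{
  "type": "Feature",
  "geometry": {
    "type": "LineString",
    "coordinates": [[-67.0, -70.5], [-66.5, -70.9], [-66.2, -71.1]]
  },
  "properties": {
    "from": "Waypoint_A",
    "to": "Waypoint_B",
    "traveltime": [0.0, 7.2, 14.7]
  }
}
```

Note that `traveltime` has one cumulative value per coordinate in the route, starting at 0.0 at the departure waypoint.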
diff --git a/docs/html/_sources/sections/Route_optimisation.rst.txt b/docs/html/_sources/sections/Route_optimisation.rst.txt
index dfa26953..d9f4f1ee 100644
--- a/docs/html/_sources/sections/Route_optimisation.rst.txt
+++ b/docs/html/_sources/sections/Route_optimisation.rst.txt
@@ -5,26 +5,32 @@ Methods - Route Planner
Route Optimisation Overview
###########################
-In this section we will outline the construction of the route paths using the Mesh construction corrected to include the objective functions define and generated in the earlier section.
+In this section we outline the code used to generate optimal routes through a mesh constructed by the methods described
+in previous sections. This mesh should include the vessel performance parameters with respect to which objective
+functions can be defined for optimisation.
+Route Optimisation Modules
+##########################
Route Planner
-#############
+*************
.. automodule:: polar_route.route_planner
:members:
+ :private-members:
-.. Crossing Points
-.. ##############
+Crossing Points
+***************
-.. .. automodule:: polar_route.crossing
-.. :members:
-
-.. .. autoclass:: polar_route.crossing.NewtonianDistance
-.. :members:
+.. automodule:: polar_route.crossing
+ :members:
+ :private-members:
-.. .. autoclass:: polar_route.crossing.NewtonianCurve
-.. :members:
+Crossing Point Smoothing
+************************
+.. automodule:: polar_route.crossing_smoothing
+ :members:
+ :private-members:
diff --git a/docs/html/_sources/sections/running.rst.txt b/docs/html/_sources/sections/running.rst.txt
deleted file mode 100644
index 314f74c8..00000000
--- a/docs/html/_sources/sections/running.rst.txt
+++ /dev/null
@@ -1,242 +0,0 @@
-**********************
-Runnning Codebase
-**********************
-
-The codebase current can be run from either pre-defined python functions or via a command line interface. Outlined below is how to run the separate sections of the software package using either of the two methods.
-
-
-Command Line Interface
-###############################
-
-The PolarRoute package provides 3 CLI entry points, intended to be used in succession to plan a route through a digital enviroment.
-
-.. figure:: ./Figures/PolarRoute_CLI.png
- :align: center
- :width: 700
-
- *Overview figure of the Command Line Interface entry points of PolarRoute*
-
-^^^^^^^^^^^^^^^^^^
-create_mesh
-^^^^^^^^^^^^^^^^^^
-The *create_mesh* entry point builds a digital enviroment file from a collection of source data, which can then be used
-by the vessel performance modeller and route planner.
-
-::
-
- create_mesh
-
-positional arguments:
-
-::
-
- config : A configuration file detailing how to build the digital enviroment. JSON parsable
-
-The format of the required ** file can be found in the :ref:`Input - Configuration` section of the documentation.
-
-optional arguments:
-
-::
-
- -v (verbose logging)
- -o