Update documentation and warnings before 0.1 release #502

Merged 57 commits into main from towards_0.1 on Nov 7, 2024

57 commits:
915fd1c
Incremental commit on streamlined doc and user warnings
rhugonnet Apr 8, 2024
2085bdb
Merge remote-tracking branch 'upstream/main' into towards_0.1
rhugonnet Apr 28, 2024
526904c
Incremental commit on documentation
rhugonnet Apr 30, 2024
1968f37
Incremental commit on documentation
rhugonnet Apr 30, 2024
507007f
Incremental commit for doc
rhugonnet May 6, 2024
ea8fb99
Homogenize contributor
rhugonnet May 6, 2024
1260677
Incremental commit on doc
rhugonnet May 7, 2024
dc8469d
Merge remote-tracking branch 'upstream/main' into towards_0.1
rhugonnet May 13, 2024
e230033
Linting
rhugonnet May 13, 2024
fe1c9f0
Incremental commit
rhugonnet May 17, 2024
461d61e
Incremental commit
rhugonnet May 22, 2024
9d657ea
Merge remote-tracking branch 'upstream/main' into towards_0.1
rhugonnet May 23, 2024
5c862c6
Show more toc levels
rhugonnet Jun 14, 2024
84d517e
Merge remote-tracking branch 'upstream/main' into towards_0.1
rhugonnet Jun 15, 2024
22bb334
Add custom css rule for toggle buttons
rhugonnet Jun 16, 2024
1086e0c
Merge remote-tracking branch 'upstream/main' into towards_0.1
rhugonnet Sep 5, 2024
767c440
Incremental commit on doc
rhugonnet Sep 7, 2024
b508171
Incremental commit on doc
rhugonnet Sep 10, 2024
35cf9dd
Add table for coregistration methods
rhugonnet Sep 11, 2024
dfeab70
Merge remote-tracking branch 'upstream/main' into towards_0.1
rhugonnet Sep 11, 2024
254833a
Add timeout to terrain and finalize coreg
rhugonnet Sep 12, 2024
9f9b05d
Incremental commit on doc
rhugonnet Sep 19, 2024
32b6ecf
Incremental commit on doc
rhugonnet Oct 2, 2024
2893605
Finalize new uncertainty page
rhugonnet Oct 5, 2024
71718c0
Remove blockwise coreg example temporarily
rhugonnet Oct 5, 2024
caf2f33
Add pipeline info()
rhugonnet Oct 5, 2024
20b3c4f
Erik's comments
rhugonnet Oct 6, 2024
4633ac9
Merge remote-tracking branch 'upstream/main' into towards_0.1
rhugonnet Oct 6, 2024
0f29bd1
Linting
rhugonnet Oct 6, 2024
7e2dc97
Fixes for tests
rhugonnet Oct 6, 2024
4e4bfd0
Fix directional bias example
rhugonnet Oct 7, 2024
073f494
Incremental commit on doc
rhugonnet Oct 22, 2024
177db6d
Almost there!
rhugonnet Oct 24, 2024
1f6d9bb
Linting
rhugonnet Oct 24, 2024
584fa09
Merge remote-tracking branch 'upstream/main' into towards_0.1
rhugonnet Oct 24, 2024
8c1ed70
Reduce build time of documentation
rhugonnet Oct 25, 2024
4aff93f
Add citation and use in publication, streamline old gallery examples
rhugonnet Oct 26, 2024
1831031
Merge remote-tracking branch 'upstream/main' into towards_0.1
rhugonnet Oct 28, 2024
2863453
Fix errors in documentation
rhugonnet Oct 28, 2024
9c6095c
Linting
rhugonnet Oct 28, 2024
eb715a1
Incremental commit on doc
rhugonnet Oct 29, 2024
a4bb1c2
Merge remote-tracking branch 'upstream/main' into towards_0.1
rhugonnet Oct 29, 2024
dbd791d
Incremental commit on doc
rhugonnet Oct 30, 2024
b1f783a
Linting
rhugonnet Oct 30, 2024
e33a2a5
Last streamlining for logging
rhugonnet Oct 30, 2024
3fa4e88
Linting
rhugonnet Oct 30, 2024
746f961
Merge remote-tracking branch 'upstream/main' into towards_0.1
rhugonnet Oct 31, 2024
c368f74
Fix test_info with new as_str argument
rhugonnet Oct 31, 2024
e79d93e
Try vertical CRS transformation with grid for cheatsheet (running out…
rhugonnet Oct 31, 2024
a7149d7
Revert cheatsheet vertical CRS code...
rhugonnet Oct 31, 2024
b81b14e
Comments from Amaury and Erik
rhugonnet Nov 7, 2024
3b4d9f4
Linting
rhugonnet Nov 7, 2024
5e6f08b
Last changes
rhugonnet Nov 7, 2024
1c15c37
Linting
rhugonnet Nov 7, 2024
9b401bb
Bump cache number
rhugonnet Nov 7, 2024
53b832d
Remove windows temporarily
rhugonnet Nov 7, 2024
c3cbc37
Modify linear to idw everywhere
rhugonnet Nov 7, 2024
Changes from 1 commit:
Incremental commit on streamlined doc and user warnings
rhugonnet committed Apr 8, 2024

commit 915fd1c831b4c51a32dd95faccac6c0049fc92ed
9 changes: 6 additions & 3 deletions doc/source/about_xdem.md
Original file line number Diff line number Diff line change
@@ -2,12 +2,15 @@

# About xDEM

## What is xDEM?

xDEM is a [Python](https://www.python.org/) package for the analysis of DEMs, with its name standing for _cross-DEM analysis_[^sn1]
and echoing its dependency on [xarray](https://docs.xarray.dev/en/stable/). It is designed for all Earth and planetary
observation science, although our group currently has a strong focus on glaciological applications.
and echoing its dependency on [Xarray](https://docs.xarray.dev/en/stable/). It is designed for all Earth and planetary
observation science.

[^sn1]: The core features of xDEM rely on cross-analysis of surface elevation, for example for DEM alignment or error analysis.
[^sn1]: The core features of xDEM rely on cross-analysis of surface elevation, for example for DEM alignment or uncertainty analysis.
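
To make this concrete, a minimal sketch of the kind of cross-DEM analysis xDEM targets, aligning one DEM to another (paths are placeholders; `xdem.DEM` and `xdem.coreg.NuthKaab` appear elsewhere in this PR, while the exact `fit`/`apply` call signature is an assumption):

```python
import xdem

# Load a reference DEM and a DEM to be aligned (placeholder paths)
ref_dem = xdem.DEM("ref_dem.tif")
tba_dem = xdem.DEM("tba_dem.tif")

# Align the second DEM to the first with the Nuth and Kaab (2011) method
coreg = xdem.coreg.NuthKaab()
coreg.fit(ref_dem, tba_dem)
aligned_dem = coreg.apply(tba_dem)

# Elevation differences, assuming both DEMs share the same grid
dh = ref_dem - aligned_dem
```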

## Why use xDEM?


```{epigraph}
Original file line number Diff line number Diff line change
@@ -1,6 +1,6 @@
(intro)=
(accuracy-precision)=

# Analysis of accuracy and precision
# Understanding accuracy and precision

Digital Elevation Models are numerical, gridded representations of elevation. They are generated from different
instruments (e.g., optical sensors, radar, lidar), acquired in different conditions (e.g., ground, airborne, satellite)
@@ -109,4 +109,4 @@ The tools for quantifying DEM precision are described in {ref}`spatialstats`.
```{eval-rst}
.. minigallery:: xdem.spatialstats.infer_heteroscedasticity_from_stable xdem.spatialstats.get_variogram_model_func xdem.spatialstats.sample_empirical_variogram
:add-heading: Examples that use spatial statistics functions
```
```
67 changes: 67 additions & 0 deletions doc/source/background.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,67 @@
(background)=

# Background

## Mission

```{epigraph}
The core mission of xDEM is to be **easy-to-use**, **modular**, **robust**, **reproducible** and **fully open**.

Additionally, xDEM aims to be **efficient**, **scalable** and **state-of-the-art**.
```

```{important}
:class: margin
xDEM is in the early stages of development and its features might evolve rapidly. Note the version you are working on for
**reproducibility**!
We are working on making features fully consistent for the first long-term release ``v0.1`` (planned early 2024).
```
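
Since this margin note stresses recording the version you work with, a one-line sketch (assuming xDEM exposes `__version__`, as most Python packages do):

```python
import xdem

# Log the exact version alongside any results, for reproducibility
print(f"Produced with xdem {xdem.__version__}")
```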

In detail, these mean:

- **Ease-of-use:** all basic DEM operations or methods from published works should require only a few lines of code to perform;

- **Modularity:** all DEM methods should be fully customizable, to allow both flexibility and inter-comparison;

- **Robustness:** all DEM methods should be tested within our continuous integration test-suite, to ensure that they always perform as expected;

- **Reproducibility:** all code should be version-controlled and release-based, to ensure consistency of dependent
packages and works;

- **Open-source:** all code should be accessible and re-usable to anyone in the community, for transparency and open governance.

```{note}
:class: margin
Additional mission points, in particular **scalability**, are partly developed but not a priority until our first long-term release ``v0.1`` is reached. These will be developed further in a subsequent version ``v0.2``.
```

And, additionally:

- **Efficiency**: all methods should be optimized at the lower level, to function with the highest performance offered by Python packages;

- **Scalability**: all methods should support both lazy processing and distributed parallelized processing, to work with high-resolution data on local machines as well as on HPCs;

- **State-of-the-art**: all methods should be at the cutting edge of remote sensing science, to provide users with the most reliable and up-to-date tools.


## The people behind xDEM

```{margin}
<sup>2</sup>More on our GlacioHack founder at [adehecq.github.io](https://adehecq.github.io/)!
```

xDEM was created during the [GlacioHack](https://github.com/GlacioHack) hackathon, which was initiated by
Amaury Dehecq<sup>2</sup> and took place online on November 8, 2020.

```{margin}
<sup>3</sup>Check out [glaciology.ch](https://glaciology.ch) for more on our founding group of VAW glaciology!
```

The initial core development of xDEM was performed by members of the Glaciology group of the Laboratory of Hydraulics, Hydrology and
Glaciology (VAW) at ETH Zürich<sup>3</sup>, with contributions by members of the University of Oslo, the University of Washington, and Université
Grenoble Alpes.

We are geoscientists rather than software developers, and we try our best to offer tools that are useful to a wider community,
and that remain documented, reliable and maintained. All development and maintenance is done on a voluntary basis, and we welcome
any new contributors. See the information on how to contribute in the dedicated page of our
[GitHub repository](https://github.com/GlacioHack/xdem/blob/main/CONTRIBUTING.md).
3 changes: 3 additions & 0 deletions doc/source/cheatsheet.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,3 @@
(cheatsheet)=

# Cheatsheet: How to correct... ?
14 changes: 14 additions & 0 deletions doc/source/elevation_intricacies.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,14 @@
(elevation-intricacies)=
# Elevation data and its intricacies

## Types of elevation data

DEM, dense elevation point cloud, sparse elevation point cloud

## A third dimension to deal with

Vertical referencing, misalignments

## The interpretation of pixel value and data formats

Pixel interpretation, netCDF vs raster formats
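
Since vertical referencing is one of the intricacies flagged above, a hedged sketch of handling it (the method names `set_vcrs`/`to_vcrs` are assumptions inferred from the vertical-CRS handling touched elsewhere in this PR; the path is a placeholder):

```python
import xdem

dem = xdem.DEM("dem.tif")  # placeholder path

# Declare the vertical reference the data is currently in, then
# transform the elevations to a geoid-based reference (names assumed)
dem.set_vcrs("Ellipsoid")
dem.to_vcrs("EGM96")
```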
12 changes: 7 additions & 5 deletions doc/source/index.md
Original file line number Diff line number Diff line change
@@ -87,11 +87,12 @@ quick_start
```

```{toctree}
:caption: Background
:caption: Resources
:maxdepth: 2

intro_robuststats
intro_accuracy_precision
elevation_intricacies
stats_for_elevation
cheatsheet
```

```{toctree}
@@ -116,10 +117,11 @@ advanced_examples/index.rst
```

```{toctree}
:caption: API Reference
:caption: API reference
:maxdepth: 2

api.rst
api
background
```

# Indices and tables
Original file line number Diff line number Diff line change
@@ -1,4 +1,4 @@
(robuststats)=
(robust-stats)=

# The need for robust statistics

@@ -107,4 +107,4 @@ The {ref}`coregistration` and {ref}`biascorr` methods encapsulate some of those

- The Random sample consensus estimator [RANSAC](https://en.wikipedia.org/wiki/Random_sample_consensus),
- The [Theil-Sen](https://en.wikipedia.org/wiki/Theil%E2%80%93Sen_estimator) estimator,
- The [Huber loss](https://en.wikipedia.org/wiki/Huber_loss) estimator.
- The [Huber loss](https://en.wikipedia.org/wiki/Huber_loss) estimator.
10 changes: 10 additions & 0 deletions doc/source/stats_for_elevation.md
Original file line number Diff line number Diff line change
@@ -0,0 +1,10 @@
(stats-for-elevation)=
# Statistics for elevation data


```{toctree}
:maxdepth: 2

robust_stats
accuracy_precision
```
2 changes: 1 addition & 1 deletion xdem/coreg/base.py
Original file line number Diff line number Diff line change
@@ -167,7 +167,7 @@ def _df_sampling_from_dem(
elif subsample > 1:
npoints = int(subsample)
else:
raise ValueError("`subsample` must be > 0")
raise ValueError("Argument `subsample` must be > 0.")

# Avoid edge, and mask-out area in sampling
width, length = dem.shape
26 changes: 13 additions & 13 deletions xdem/coreg/workflows.py
Original file line number Diff line number Diff line change
@@ -55,14 +55,14 @@ def create_inlier_mask(
# - Sanity check on inputs - #
# Check correct input type of shp_list
if not isinstance(shp_list, (list, tuple)):
raise ValueError("`shp_list` must be a list/tuple")
raise ValueError("Argument `shp_list` must be a list/tuple.")
for el in shp_list:
if not isinstance(el, (str, gu.Vector)):
raise ValueError("`shp_list` must be a list/tuple of strings or geoutils.Vector instance")
raise ValueError("Argument `shp_list` must be a list/tuple of strings or geoutils.Vector instance.")

# Check correct input type of inout
if not isinstance(inout, (list, tuple)):
raise ValueError("`inout` must be a list/tuple")
raise ValueError("Argument `inout` must be a list/tuple.")

if len(shp_list) > 0:
if len(inout) == 0:
@@ -72,18 +72,18 @@ def create_inlier_mask(
# Check that inout contains only 1 and -1
not_valid = [el for el in np.unique(inout) if ((el != 1) & (el != -1))]
if len(not_valid) > 0:
raise ValueError("`inout` must contain only 1 and -1")
raise ValueError("Argument `inout` must contain only 1 and -1.")
else:
raise ValueError("`inout` must be of same length as shp")
raise ValueError("Argument `inout` must be of same length as shp.")

# Check slope_lim type
if not isinstance(slope_lim, (list, tuple)):
raise ValueError("`slope_lim` must be a list/tuple")
raise ValueError("Argument `slope_lim` must be a list/tuple.")
if len(slope_lim) != 2:
raise ValueError("`slope_lim` must contain 2 elements")
raise ValueError("Argument `slope_lim` must contain 2 elements.")
for el in slope_lim:
if (not isinstance(el, (int, float, np.integer, np.floating))) or (el < 0) or (el > 90):
raise ValueError("`slope_lim` must be a tuple/list of 2 elements in the range [0-90]")
raise ValueError("Argument `slope_lim` must be a tuple/list of 2 elements in the range [0-90].")

# Initialize inlier_mask with no masked pixel
inlier_mask = np.ones(src_dem.data.shape, dtype="bool")
@@ -176,25 +176,25 @@ def dem_coregistration(
"""
# Check inputs
if not isinstance(coreg_method, Coreg):
raise ValueError("`coreg_method` must be an xdem.coreg instance (e.g. xdem.coreg.NuthKaab())")
raise ValueError("Argument `coreg_method` must be an xdem.coreg instance (e.g. xdem.coreg.NuthKaab()).")

if isinstance(ref_dem_path, str):
if not isinstance(src_dem_path, str):
raise ValueError(
f"`ref_dem_path` is string but `src_dem_path` has type {type(src_dem_path)}."
f"Argument `ref_dem_path` is string but `src_dem_path` has type {type(src_dem_path)}."
"Both must have same type."
)
elif isinstance(ref_dem_path, gu.Raster):
if not isinstance(src_dem_path, gu.Raster):
raise ValueError(
f"`ref_dem_path` is of Raster type but `src_dem_path` has type {type(src_dem_path)}."
f"Argument `ref_dem_path` is of Raster type but `src_dem_path` has type {type(src_dem_path)}."
"Both must have same type."
)
else:
raise ValueError("`ref_dem_path` must be either a string or a Raster")
raise ValueError("Argument `ref_dem_path` must be either a string or a Raster.")

if grid not in ["ref", "src"]:
raise ValueError(f"`grid` must be either 'ref' or 'src' - currently set to {grid}")
raise ValueError(f"Argument `grid` must be either 'ref' or 'src' - currently set to {grid}.")

# Load both DEMs
if verbose:
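
The checks above spell out the expected inputs of `dem_coregistration`; a minimal sketch of a call that satisfies them, using keyword arguments since the positional order is not shown in this diff (paths are placeholders, and the return value is left unpacked for the same reason):

```python
import xdem
from xdem.coreg.workflows import dem_coregistration

# Both DEM inputs must share a type (two path strings or two Rasters),
# and `grid` must be "ref" or "src", per the validation above
result = dem_coregistration(
    src_dem_path="src_dem.tif",
    ref_dem_path="ref_dem.tif",
    coreg_method=xdem.coreg.NuthKaab(),
    grid="ref",
)
```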
3 changes: 2 additions & 1 deletion xdem/dem.py
Original file line number Diff line number Diff line change
@@ -110,7 +110,8 @@ def __init__(

# Ensure DEM has only one band: self.bands can be None when data is not loaded through the Raster class
if self.bands is not None and len(self.bands) > 1:
raise ValueError("DEM rasters should be composed of one band only")
raise ValueError("DEM rasters should be composed of one band only. Either use argument `bands` to specify "
"a single band on opening, or use .split_bands() on an opened raster.")

# If the CRS in the raster metadata has a 3rd dimension, could set it as a vertical reference
vcrs_from_crs = _vcrs_from_crs(CRS(self.crs))
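
The reworded error above names its own workarounds; a sketch of the first one (the path is a placeholder, and the `bands` argument is taken directly from the message rather than verified against the API):

```python
import xdem

# A multi-band file would raise the single-band error above;
# selecting one band on opening avoids it
dem = xdem.DEM("multiband_raster.tif", bands=1)
```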
7 changes: 4 additions & 3 deletions xdem/demcollection.py
Original file line number Diff line number Diff line change
@@ -36,7 +36,8 @@ def __init__(
if timestamps is None:
timestamp_attributes = [dem.datetime for dem in dems]
if any(stamp is None for stamp in timestamp_attributes):
raise ValueError("'timestamps' not provided and the given DEMs do not all have datetime attributes")
raise ValueError("Argument `timestamps` not provided and the given DEMs do not all have datetime "
"attributes")

timestamps = timestamp_attributes

@@ -183,7 +184,7 @@ def get_dh_series(
:returns: A dataframe of dH values and respective areas with an Interval[Timestamp] index.
"""
if len(self.ddems) == 0:
raise ValueError("dDEMs have not yet been calculated")
raise ValueError("dDEMs have not yet been calculated.")

dh_values = pd.DataFrame(columns=["dh", "area"], dtype=float)
for _, ddem in enumerate(self.ddems):
@@ -249,7 +250,7 @@ def get_cumulative_series(
# Get the dV series (where all indices are: "year to reference_year")
d_series = self.get_dv_series(mask=mask, outlines_filter=outlines_filter, nans_ok=nans_ok)
else:
raise ValueError("Invalid argument: '{dh=}'. Choices: ['dh', 'dv']")
raise ValueError("Invalid argument: '{dh=}'. Choices: ['dh', 'dv'].")

# Simplify the index to just "year" (implicitly still the same as above)
cumulative_dh = pd.Series(dtype=d_series.dtype)
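
A hedged sketch of satisfying the `timestamps` check above (the `DEMCollection` name is assumed from the module name, and the constructor layout from this hunk: a list of DEMs plus matching timestamps, or DEMs that already carry `datetime` attributes):

```python
from datetime import datetime

import xdem

# Placeholder paths; one timestamp per DEM, in the same order
dems = [xdem.DEM("dem_2010.tif"), xdem.DEM("dem_2020.tif")]
timestamps = [datetime(2010, 8, 1), datetime(2020, 8, 1)]

collection = xdem.DEMCollection(dems, timestamps=timestamps)
```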
2 changes: 1 addition & 1 deletion xdem/examples.py
Original file line number Diff line number Diff line change
@@ -61,7 +61,7 @@ def download_longyearbyen_examples(overwrite: bool = False) -> None:
with open(tar_path, "wb") as outfile:
outfile.write(response.read())
else:
raise ValueError(f"Longyearbyen data fetch gave non-200 response: {response.status_code}")
raise ValueError(f"Longyearbyen data fetch gave non-200 response: {response.status_code}.")

# Extract the tarball
with tarfile.open(tar_path) as tar:
8 changes: 4 additions & 4 deletions xdem/filters.py
Original file line number Diff line number Diff line change
@@ -29,7 +29,7 @@ def gaussian_filter_scipy(array: NDArrayf, sigma: float) -> NDArrayf:
"""
# Check that array dimension is 2 or 3
if np.ndim(array) not in [2, 3]:
raise ValueError(f"Invalid array shape given: {array.shape}. Expected 2D or 3D array")
raise ValueError(f"Invalid array shape given: {array.shape}. Expected 2D or 3D array.")

# In case array does not contain NaNs, use scipy's gaussian filter directly
if np.count_nonzero(np.isnan(array)) == 0:
@@ -71,7 +71,7 @@ def gaussian_filter_cv(array: NDArrayf, sigma: float) -> NDArrayf:
:returns: the filtered array (same shape as input)
"""
if not _has_cv2:
raise ValueError("Optional dependency needed. Install 'opencv'")
raise ValueError("Optional dependency needed. Install 'opencv'.")

# Check that array dimension is 2, or can be squeezed to 2D
orig_shape = array.shape
@@ -81,9 +81,9 @@ def gaussian_filter_cv(array: NDArrayf, sigma: float) -> NDArrayf:
if orig_shape[0] == 1:
array = array.squeeze()
else:
raise NotImplementedError("Case of array of dimension 3 not implemented")
raise NotImplementedError("Case of array of dimension 3 not implemented.")
else:
raise ValueError(f"Invalid array shape given: {orig_shape}. Expected 2D or 3D array")
raise ValueError(f"Invalid array shape given: {orig_shape}. Expected 2D or 3D array.")

# In case array does not contain NaNs, use OpenCV's gaussian filter directly
# With kernel size (0, 0), i.e. set to default, and borderType=BORDER_REFLECT, the output is equivalent to scipy
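
The comments above hint at special-casing NaNs; a common way to do this (a sketch of the general technique, not necessarily xDEM's exact implementation) is normalized convolution, filtering values and weights separately:

```python
import numpy as np
from scipy import ndimage

def gaussian_filter_nan(array: np.ndarray, sigma: float) -> np.ndarray:
    """Gaussian-filter an array while ignoring NaNs (normalized convolution)."""
    nan_mask = np.isnan(array)
    values = np.where(nan_mask, 0.0, array)
    weights = (~nan_mask).astype(float)
    smoothed = ndimage.gaussian_filter(values, sigma=sigma)
    weight_sum = ndimage.gaussian_filter(weights, sigma=sigma)
    with np.errstate(invalid="ignore"):
        out = smoothed / weight_sum
    out[nan_mask] = np.nan  # keep the original gaps masked
    return out
```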
6 changes: 3 additions & 3 deletions xdem/fit.py
Original file line number Diff line number Diff line change
@@ -361,9 +361,9 @@ def robust_norder_polynomial_fit(

# Raise errors for input string parameters
if not isinstance(estimator_name, str) or estimator_name not in ["Linear", "Theil-Sen", "RANSAC", "Huber"]:
raise ValueError('Attribute estimator must be one of "Linear", "Theil-Sen", "RANSAC" or "Huber".')
raise ValueError('Attribute `estimator` must be one of "Linear", "Theil-Sen", "RANSAC" or "Huber".')
if not isinstance(linear_pkg, str) or linear_pkg not in ["sklearn", "scipy"]:
raise ValueError('Attribute linear_pkg must be one of "scipy" or "sklearn".')
raise ValueError('Attribute `linear_pkg` must be one of "scipy" or "sklearn".')

# Extract xdata from iterable
if len(xdata) == 1:
@@ -404,7 +404,7 @@ def robust_norder_polynomial_fit(
else:
# Otherwise, we use sklearn
if not _has_sklearn:
raise ValueError("Optional dependency needed. Install 'scikit-learn'")
raise ValueError("Optional dependency needed. Install 'scikit-learn'.")

# Define the polynomial model to insert in the pipeline
model = PolynomialFeatures(degree=deg)
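
The `PolynomialFeatures` pipeline above pairs polynomial features with one of the robust estimators named earlier; a self-contained sketch of that pattern with RANSAC (synthetic data, scikit-learn only):

```python
import numpy as np
from sklearn.linear_model import RANSACRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import PolynomialFeatures

rng = np.random.default_rng(42)
x = np.linspace(-5, 5, 200)
y = 0.5 * x**2 - x + 2 + rng.normal(0, 0.5, x.size)
y[::20] += 30  # inject strong outliers

# Polynomial features + robust estimator, as in the pipeline pattern above
model = make_pipeline(PolynomialFeatures(degree=2), RANSACRegressor())
model.fit(x.reshape(-1, 1), y)
y_fit = model.predict(x.reshape(-1, 1))  # fit is barely affected by outliers
```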
4 changes: 2 additions & 2 deletions xdem/misc.py
Original file line number Diff line number Diff line change
@@ -47,7 +47,7 @@ def generate_random_field(shape: tuple[int, int], corr_size: int) -> NDArrayf:
"""

if not _has_cv2:
raise ValueError("Optional dependency needed. Install 'opencv'")
raise ValueError("Optional dependency needed. Install 'opencv'.")

field = cv2.resize(
cv2.GaussianBlur(
@@ -191,7 +191,7 @@ def diff_environment_yml(
"""

if not _has_yaml:
raise ValueError("Test dependency needed. Install 'pyyaml'")
raise ValueError("Test dependency needed. Install 'pyyaml'.")

if not input_dict:
# Load the yml as dictionaries
8 changes: 4 additions & 4 deletions xdem/terrain.py
Original file line number Diff line number Diff line change
@@ -321,23 +321,23 @@ def get_quadric_coefficients(
if len(dem_arr.shape) != 2:
raise ValueError(
f"Invalid input array shape: {dem.shape}, parsed into {dem_arr.shape}. "
"Expected 2D array or 3D array of shape (1, row, col)"
"Expected 2D array or 3D array of shape (1, row, col)."
)

if any(dim < 3 for dim in dem_arr.shape):
raise ValueError(f"DEM (shape: {dem.shape}) is too small. Smallest supported shape is (3, 3)")
raise ValueError(f"DEM (shape: {dem.shape}) is too small. Smallest supported shape is (3, 3).")

# Resolution is in other tools accepted as a tuple. Here, it must be just one number, so it's best to sanity check.
if isinstance(resolution, Sized):
raise ValueError("Resolution must be the same for X and Y directions")
raise ValueError("Resolution must be the same for X and Y directions.")

allowed_fill_methods = ["median", "mean", "none"]
allowed_edge_methods = ["nearest", "wrap", "none"]
for value, name, allowed in zip(
[fill_method, edge_method], ["fill", "edge"], (allowed_fill_methods, allowed_edge_methods)
):
if value.lower() not in allowed:
raise ValueError(f"Invalid {name} method: '{value}'. Choices: {allowed}")
raise ValueError(f"Invalid {name} method: '{value}'. Choices: {allowed}.")

# Try to run the numba JIT code. It should never fail at this point, so if it does, it should be reported!
try:
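
For context on how these checks surface to users, a hedged sketch of computing a terrain attribute on a DEM (the `xdem.terrain.slope` call and the example-data helper `get_path` are assumptions; `download_longyearbyen_examples` elsewhere in this diff suggests such example data exists):

```python
import xdem

# Example-data helper assumed; any single-band DEM raster works here
dem = xdem.DEM(xdem.examples.get_path("longyearbyen_ref_dem"))

# Terrain attributes expect a 2D array of at least (3, 3) with a single
# (square) resolution, per the validation above
slope = xdem.terrain.slope(dem)
```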
6 changes: 3 additions & 3 deletions xdem/volume.py
Original file line number Diff line number Diff line change
@@ -77,7 +77,7 @@ def hypsometric_binning(
elif kind == "custom":
zbins = bins # type: ignore
else:
raise ValueError(f"Invalid bin kind: {kind}. Choices: ['fixed', 'count', 'quantile', 'custom']")
raise ValueError(f"Invalid bin kind: {kind}. Choices: ['fixed', 'count', 'quantile', 'custom'].")

# Generate bins and get bin indices from the mean DEM
indices = np.digitize(ref_dem, bins=zbins)
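
A standalone sketch of the binning step shown above, on synthetic arrays (NumPy and pandas only):

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)
ref_dem = rng.uniform(200, 1200, 10_000)  # synthetic elevations (m)
ddem = rng.normal(-1.0, 0.5, 10_000)      # synthetic elevation change (m)

# "fixed" kind: regular 100 m elevation bins
zbins = np.arange(200, 1300, 100)
indices = np.digitize(ref_dem, bins=zbins)

# Median elevation change per elevation bin
binned = pd.Series(ddem).groupby(indices).median()
print(binned)
```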
@@ -248,7 +248,7 @@ def calculate_hypsometry_area(
assert not np.any(np.isnan(ref_dem)), "The given reference DEM has NaNs. No NaNs are allowed to calculate area!"

if timeframe not in ["reference", "nonreference", "mean"]:
raise ValueError(f"Argument 'timeframe={timeframe}' is invalid. Choices: ['reference', 'nonreference', 'mean']")
raise ValueError(f"Argument 'timeframe={timeframe}' is invalid. Choices: ['reference', 'nonreference', 'mean'].")

if isinstance(ddem_bins, pd.DataFrame):
ddem_bins = ddem_bins["value"]
@@ -301,7 +301,7 @@ def linear_interpolation(
:returns: A filled array with no NaNs
"""
if not _has_cv2:
raise ValueError("Optional dependency needed. Install 'opencv'")
raise ValueError("Optional dependency needed. Install 'opencv'.")

# Create a mask for where nans exist
nan_mask = get_mask_from_array(array)