Fix Sphinx warnings related to whitespace (#197)
* Ensure API docs are generated to gitignored doc/source/_icepyx folder
* Fix whitespace issues in *.rst files
* Fix whitespace issues in Earthdata.py and variables.py
Adding or removing spaces so that Sphinx stops complaining like "WARNING: Explicit markup ends without a blank line".
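The reST pattern behind that warning is visible in the contact.rst and resources.rst hunks below: a hyperlink target such as ".. _contact_ref_label:" is explicit markup, and docutils requires a blank line before the content that follows it. As an illustrative before/after (the "before" form is reconstructed from those one-line additions, not shown verbatim in the diff), this layout provokes the warning:

    .. _contact_ref_label:
    Contact Us
    ==========

while adding a blank line after the target, as this commit does, silences it:

    .. _contact_ref_label:

    Contact Us
    ==========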
weiji14 authored Apr 12, 2021
1 parent c4439e4 commit a582a27
Showing 7 changed files with 75 additions and 65 deletions.
13 changes: 7 additions & 6 deletions ATTRIBUTION.rst
@@ -1,14 +1,15 @@
.. _attribution_ref_label:

Attribution Guidelines
======================

We are extremely grateful to everyone who has contributed to the success of the icepyx community, whether through direct contributions to or feedback about icepyx or as developers or maintainers of complementary resources that are included within the icepyx ecosystem. This document outlines our goals to give appropriate attribution to all contributors to icepyx in ways that are fair, diverse, and supportive of professional goals. To do so, we broadly define *contributions* as:

Efforts towards achieving icepyx's goals, including writing code, tests, or documentation,
-development of example workflows, development, significant contributions, or maintenance of
-a tailored package that broadens the functionality of icepyx, feedback and suggestions,
+development of example workflows, development, significant contributions, or maintenance of
+a tailored package that broadens the functionality of icepyx, feedback and suggestions,
community building, etc.

We use the terms "contributors", "developers", and "authors" interchangeably. We will recognize contributions in the following ways.

Contributors List
@@ -39,11 +40,11 @@ Authorship on scientific papers currently constitutes an important metric for as
2. Add themself to the `Contributors List`_.
3. Contribute ideas, participate in authorship discussions (see next paragraph), write, read, and review the manuscript in a timely manner, and provide feedback (acknowledgement of review is sufficient, but we'd prefer more).

-Author order will be determined based on co-author discussion, led by the lead author, during the initial planning stages of manuscript preparation (i.e. as soon as an idea matures into a potential manuscript and before writing begins). Authorship will continue to be evaluated throughout the manuscript preparation process. Discussions will consider authorship norms (e.g. How does author order convey participation and prestige? How critical is first authorship to career advancement for each member of the team? Do an individual's contributions meet authorship criteria or are they more suited to acknowledgements?). Author order determination will also consider metrics such as the number of commits since the last major release with an associated paper (``git shortlog vX.0.0...HEAD -sne``), contributions that do not have associated commits, and contributions to the preparation of the manuscript.
+Author order will be determined based on co-author discussion, led by the lead author, during the initial planning stages of manuscript preparation (i.e. as soon as an idea matures into a potential manuscript and before writing begins). Authorship will continue to be evaluated throughout the manuscript preparation process. Discussions will consider authorship norms (e.g. How does author order convey participation and prestige? How critical is first authorship to career advancement for each member of the team? Do an individual's contributions meet authorship criteria or are they more suited to acknowledgements?). Author order determination will also consider metrics such as the number of commits since the last major release with an associated paper (``git shortlog vX.0.0...HEAD -sne``), contributions that do not have associated commits, and contributions to the preparation of the manuscript.



-Disclaimer: These policies are not permanent or fixed and may change to accommodate community growth,
+Disclaimer: These policies are not permanent or fixed and may change to accommodate community growth,
best practices, and feedback.

-Copyright notice: This document was inspired by the `authorship guidelines <https://github.com/fatiando/contributing/blob/master/AUTHORSHIP.md>`_ provided by `Fatiando a Terra <https://github.com/fatiando>`_ and encourages potential co-authors to consider the resources provided by the `NASA High Mountain Asia Team (HiMAT) <https://highmountainasia.github.io/team-collaboration/authorship/>`_.
+Copyright notice: This document was inspired by the `authorship guidelines <https://github.com/fatiando/contributing/blob/master/AUTHORSHIP.md>`_ provided by `Fatiando a Terra <https://github.com/fatiando>`_ and encourages potential co-authors to consider the resources provided by the `NASA High Mountain Asia Team (HiMAT) <https://highmountainasia.github.io/team-collaboration/authorship/>`_.
3 changes: 2 additions & 1 deletion doc/source/community/contact.rst
@@ -1,4 +1,5 @@
.. _contact_ref_label:
+
Contact Us
==========

@@ -20,4 +21,4 @@ We meet on:

Additional information about logging in to the meetings can be found on `this Discourse post <https://discourse.pangeo.io/t/icepyx-team-meetings/722/2?u=jessicas11>`_.

-Absolutely NO previous software development experience is necessary to attend any meeting. Think of them more like coffee hour mixed with office hours than a conference call. We look forward to seeing you there!
+Absolutely NO previous software development experience is necessary to attend any meeting. Think of them more like coffee hour mixed with office hours than a conference call. We look forward to seeing you there!
1 change: 1 addition & 0 deletions doc/source/community/resources.rst
@@ -1,4 +1,5 @@
.. _resource_ref_label:
+
ICESat-2 Open-Source Resources Guide
====================================

3 changes: 1 addition & 2 deletions doc/source/user_guide/documentation/icepyx.rst
@@ -1,11 +1,10 @@
icepyx Documentation (API Reference)
====================================



.. image:: icepyx_class_diagram.png
:width: 600
:alt: PlantUML Class Diagram illustrating the public-facing classes within icepyx, their attributes and methods, and their relationships (e.g. component classes).

icepyx class diagram illustrating the library's public-facing classes, their attributes and methods, and their relationships.


6 changes: 3 additions & 3 deletions doc/source/user_guide/documentation/query.rst
@@ -8,7 +8,7 @@ Constructor
-----------

.. autosummary::
-   :toctree: ../_icepyx/
+   :toctree: ../../_icepyx/

Query

@@ -17,7 +17,7 @@ Attributes
----------

.. autosummary::
-   :toctree: ../_icepyx/
+   :toctree: ../../_icepyx/

Query.CMRparams
Query.cycles
@@ -38,7 +38,7 @@ Methods
-------

.. autosummary::
-   :toctree: ../_icepyx/
+   :toctree: ../../_icepyx/

Query.avail_granules
Query.dataset_all_info
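A note on the :toctree: change above: query.rst lives in doc/source/user_guide/documentation/, so "../_icepyx/" pointed the generated autosummary stubs at doc/source/user_guide/_icepyx/, whereas "../../_icepyx/" places them in doc/source/_icepyx/, the gitignored output folder named in the first bullet of the commit message.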
12 changes: 10 additions & 2 deletions icepyx/core/Earthdata.py
Expand Up @@ -28,7 +28,11 @@ class Earthdata:
"""

def __init__(
self, uid, email, capability_url, pswd=None,
self,
uid,
email,
capability_url,
pswd=None,
):

assert isinstance(uid, str), "Enter your login user id as a string"
@@ -63,7 +67,9 @@ def _start_session(self):

        response = None
        response = requests.post(
-            token_api_url, json=data, headers={"Accept": "application/json", "Client-Id": "icepyx"}
+            token_api_url,
+            json=data,
+            headers={"Accept": "application/json", "Client-Id": "icepyx"},
        )

        # check for a valid login
@@ -102,7 +108,9 @@ def login(self):
Then change the permissions of that file to 600
This will allow you to have read and write access to the file
No other user can access the file
$ chmod 600 ~/.netrc
The function checks for this file to retrieve credentials, prior to
prompting for manual input.
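For context on the .netrc instructions in this docstring, here is a hedged sketch (not icepyx's actual implementation) of how such a credentials file can be read with Python's standard netrc module; the Earthdata host name and the placeholder login values are illustrative assumptions:

    import netrc

    # A ~/.netrc entry (permissions 600) would hold one line such as:
    #   machine urs.earthdata.nasa.gov login my_uid password my_password
    creds = netrc.netrc()  # parses ~/.netrc in the home directory by default
    # authenticators() returns a (login, account, password) tuple, or None if the host is missing
    login, _, password = creds.authenticators("urs.earthdata.nasa.gov")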
102 changes: 51 additions & 51 deletions icepyx/core/variables.py
Expand Up @@ -33,7 +33,7 @@ class Variables:
Properly formatted string specifying a valid version of the ICESat-2 dataset
source : string, default None
For vartype file, a path to a directory or single input source files (not yet implemented)
"""
"""

def __init__(
self,
@@ -77,7 +77,7 @@ def avail(self, options=False, internal=False):
>>> reg_a.earthdata_login(user_id,user_email)
Earthdata Login password: ········
>>> reg_a.order_vars.avail()
-['ancillary_data/atlas_sdp_gps_epoch',
+['ancillary_data/atlas_sdp_gps_epoch',
'ancillary_data/control',
'ancillary_data/data_end_utc',
'ancillary_data/data_start_utc',
@@ -140,39 +140,39 @@ def parse_var_list(varlist):
>>> var_dict.keys()
dict_keys(['atlas_sdp_gps_epoch', 'control', 'data_end_utc', 'data_start_utc',
'end_cycle', 'end_delta_time', 'end_geoseg', 'end_gpssow', 'end_gpsweek',
-'end_orbit', 'end_region', 'end_rgt', 'granule_end_utc', 'granule_start_utc',
-'qa_at_interval', 'release', 'start_cycle', 'start_delta_time', 'start_geoseg',
-'start_gpssow', 'start_gpsweek', 'start_orbit', 'start_region', 'start_rgt',
-'version', 'dt_hist', 'fit_maxiter', 'fpb_maxiter', 'maxiter', 'max_res_ids',
-'min_dist', 'min_gain_th', 'min_n_pe', 'min_n_sel', 'min_signal_conf', 'n_hist',
-'nhist_bins', 'n_sigmas', 'proc_interval', 'qs_lim_bsc', 'qs_lim_hrs', 'qs_lim_hsigma',
-'qs_lim_msw', 'qs_lim_snr', 'qs_lim_sss', 'rbin_width', 'sigma_beam', 'sigma_tx',
-'t_dead', 'atl06_quality_summary', 'delta_time', 'h_li', 'h_li_sigma', 'latitude',
-'longitude', 'segment_id', 'sigma_geo_h', 'fpb_mean_corr', 'fpb_mean_corr_sigma',
-'fpb_med_corr', 'fpb_med_corr_sigma', 'fpb_n_corr', 'med_r_fit', 'tx_mean_corr',
+'end_orbit', 'end_region', 'end_rgt', 'granule_end_utc', 'granule_start_utc',
+'qa_at_interval', 'release', 'start_cycle', 'start_delta_time', 'start_geoseg',
+'start_gpssow', 'start_gpsweek', 'start_orbit', 'start_region', 'start_rgt',
+'version', 'dt_hist', 'fit_maxiter', 'fpb_maxiter', 'maxiter', 'max_res_ids',
+'min_dist', 'min_gain_th', 'min_n_pe', 'min_n_sel', 'min_signal_conf', 'n_hist',
+'nhist_bins', 'n_sigmas', 'proc_interval', 'qs_lim_bsc', 'qs_lim_hrs', 'qs_lim_hsigma',
+'qs_lim_msw', 'qs_lim_snr', 'qs_lim_sss', 'rbin_width', 'sigma_beam', 'sigma_tx',
+'t_dead', 'atl06_quality_summary', 'delta_time', 'h_li', 'h_li_sigma', 'latitude',
+'longitude', 'segment_id', 'sigma_geo_h', 'fpb_mean_corr', 'fpb_mean_corr_sigma',
+'fpb_med_corr', 'fpb_med_corr_sigma', 'fpb_n_corr', 'med_r_fit', 'tx_mean_corr',
tx_med_corr', 'dem_flag', 'dem_h', 'geoid_h', 'dh_fit_dx', 'dh_fit_dx_sigma', '
-dh_fit_dy', 'h_expected_rms', 'h_mean', 'h_rms_misfit', 'h_robust_sprd',
-'n_fit_photons', 'n_seg_pulses', 'sigma_h_mean', 'signal_selection_source',
-'signal_selection_source_status', 'snr', 'snr_significance', 'w_surface_window_final',
-ckgrd', 'bsnow_conf', 'bsnow_h', 'bsnow_od', 'cloud_flg_asr', 'cloud_flg_atm', 'dac',
-'e_bckgrd', 'layer_flag', 'msw_flag', 'neutat_delay_total', 'r_eff', 'solar_azimuth',
-'solar_elevation', 'tide_earth', 'tide_equilibrium', 'tide_load', 'tide_ocean',
-'tide_pole', 'ref_azimuth', 'ref_coelv', 'seg_azimuth', 'sigma_geo_at', 'sigma_geo_r',
-igma_geo_xt', 'x_atc', 'y_atc', 'bckgrd_per_m', 'bin_top_h', 'count', 'ds_segment_id',
-'lat_mean', 'lon_mean', 'pulse_count', 'segment_id_list', 'x_atc_mean', 'record_number',
-'reference_pt_lat', 'reference_pt_lon', 'signal_selection_status_all',
-'signal_selection_status_backup', 'signal_selection_status_confident', 'crossing_time',
-'cycle_number', 'lan', 'orbit_number', 'rgt', 'sc_orient', 'sc_orient_time',
-'qa_granule_fail_reason', 'qa_granule_pass_fail', 'signal_selection_source_fraction_0',
-'signal_selection_source_fraction_1', 'signal_selection_source_fraction_2',
+dh_fit_dy', 'h_expected_rms', 'h_mean', 'h_rms_misfit', 'h_robust_sprd',
+'n_fit_photons', 'n_seg_pulses', 'sigma_h_mean', 'signal_selection_source',
+'signal_selection_source_status', 'snr', 'snr_significance', 'w_surface_window_final',
+ckgrd', 'bsnow_conf', 'bsnow_h', 'bsnow_od', 'cloud_flg_asr', 'cloud_flg_atm', 'dac',
+'e_bckgrd', 'layer_flag', 'msw_flag', 'neutat_delay_total', 'r_eff', 'solar_azimuth',
+'solar_elevation', 'tide_earth', 'tide_equilibrium', 'tide_load', 'tide_ocean',
+'tide_pole', 'ref_azimuth', 'ref_coelv', 'seg_azimuth', 'sigma_geo_at', 'sigma_geo_r',
+igma_geo_xt', 'x_atc', 'y_atc', 'bckgrd_per_m', 'bin_top_h', 'count', 'ds_segment_id',
+'lat_mean', 'lon_mean', 'pulse_count', 'segment_id_list', 'x_atc_mean', 'record_number',
+'reference_pt_lat', 'reference_pt_lon', 'signal_selection_status_all',
+'signal_selection_status_backup', 'signal_selection_status_confident', 'crossing_time',
+'cycle_number', 'lan', 'orbit_number', 'rgt', 'sc_orient', 'sc_orient_time',
+'qa_granule_fail_reason', 'qa_granule_pass_fail', 'signal_selection_source_fraction_0',
+'signal_selection_source_fraction_1', 'signal_selection_source_fraction_2',
'signal_selection_source_fraction_3'])
>>> import numpy
>>> numpy.unique(paths)
array(['ancillary_data', 'bias_correction', 'dem', 'fit_statistics',
-'geophysical', 'ground_track', 'gt1l', 'gt1r', 'gt2l', 'gt2r',
-'gt3l', 'gt3r', 'land_ice', 'land_ice_segments', 'none',
-'orbit_info', 'quality_assessment', 'residual_histogram',
-'segment_quality', 'signal_selection_status'], dtype='<U23')
+'geophysical', 'ground_track', 'gt1l', 'gt1r', 'gt2l', 'gt2r',
+'gt3l', 'gt3r', 'land_ice', 'land_ice_segments', 'none',
+'orbit_info', 'quality_assessment', 'residual_histogram',
+'segment_quality', 'signal_selection_status'], dtype='<U23')
"""

# create a dictionary of variable names and paths
@@ -336,23 +336,23 @@ def _iter_paths(self, sum_varlist, req_vars, vgrp, beam_list, keyword_list):
# DevGoal: we can ultimately add an "interactive" trigger that will open the not-yet-made widget. Otherwise, it will use the var_list passed by the user/defaults
def append(self, defaults=False, var_list=None, beam_list=None, keyword_list=None):
"""
-Add to the list of desired variables using user specified beams and variable list.
-A pregenerated default variable list can be used by setting defaults to True.
+Add to the list of desired variables using user specified beams and variable list.
+A pregenerated default variable list can be used by setting defaults to True.
Note: The calibrated backscatter cab_prof is not in the default list for ATL09
Parameters
----------
defaults : boolean, default False
-Include the variables in the default variable list. Defaults are defined per-data product.
+Include the variables in the default variable list. Defaults are defined per-data product.
When specified in conjunction with a var_list, default variables not on the user-
specified list will be added to the order.
var_list : list of strings, default None
-A list of variables to request, if not all available variables are wanted.
+A list of variables to request, if not all available variables are wanted.
A list of available variables can be obtained by entering `var_list=['']` into the function.
beam_list : list of strings, default None
-A list of beam strings, if only selected beams are wanted (the default value of None will automatically
+A list of beam strings, if only selected beams are wanted (the default value of None will automatically
include all beams). For ATL09, acceptable values are ['profile_1', 'profile_2', 'profile_3'].
For all other datasets, acceptable values are ['gt1l', 'gt1r', 'gt2l', 'gt2r', 'gt3l', 'gt3r'].
@@ -374,19 +374,19 @@ def append(self, defaults=False, var_list=None, beam_list=None, keyword_list=Non
Earthdata Login password: ········
To add all variables related to a specific ICESat-2 beam
>>> reg_a.order_vars.append(beam_list=['gt1r'])
To include the default variables:
>>> reg_a.order_vars.append(defaults=True)
To add specific variables in orbit_info
>>> reg_a.order_vars.append(keyword_list=['orbit_info'],var_list=['sc_orient_time'])
To add all variables and paths in ancillary_data
>>> reg_a.order_vars.append(keyword_list=['ancillary_data'])
"""

@@ -454,19 +454,19 @@ def append(self, defaults=False, var_list=None, beam_list=None, keyword_list=Non
def remove(self, all=False, var_list=None, beam_list=None, keyword_list=None):
"""
Remove the variables and paths from the wanted list using user specified beam, keyword,
-and variable lists.
+and variable lists.
Parameters:
-----------
all : boolean, default False
Remove all variables and paths from the wanted list.
var_list : list of strings, default None
-A list of variables to request, if not all available variables are wanted.
+A list of variables to request, if not all available variables are wanted.
A list of available variables can be obtained by entering `var_list=['']` into the function.
beam_list : list of strings, default None
-A list of beam strings, if only selected beams are wanted (the default value of None will automatically
+A list of beam strings, if only selected beams are wanted (the default value of None will automatically
include all beams). For ATL09, acceptable values are ['profile_1', 'profile_2', 'profile_3'].
For all other datasets, acceptable values are ['gt1l', 'gt1r', 'gt2l', 'gt2r', 'gt3l', 'gt3r'].
@@ -487,19 +487,19 @@ def remove(self, all=False, var_list=None, beam_list=None, keyword_list=None):
Earthdata Login password: ········
To clear the list of wanted variables
>>> reg_a.order_vars.remove(all=True)
To remove all variables related to a specific ICESat-2 beam
>>> reg_a.order_vars.remove(beam_list=['gt1r'])
To remove specific variables in orbit_info
>>> reg_a.order_vars.remove(keyword_list=['orbit_info'],var_list=['sc_orient_time'])
To remove all variables and paths in ancillary_data
>>> reg_a.order_vars.remove(keyword_list=['ancillary_data'])
"""
