Stingray Submission #201

Open · 19 of 31 tasks
matteobachetti opened this issue Jun 14, 2024 · 19 comments

Labels: 3/reviewers-assigned, astropy (An astropy community affiliated package review)

matteobachetti commented Jun 14, 2024

Submitting Author: @matteobachetti
All current maintainers: @dhuppenkothen @mgullik @jdswinbank @matteolucchini1
Package Name: Stingray
One-Line Description of Package: A spectral-timing software package for astrophysical X-ray (and other) data
Repository Link: https://github.com/stingraysoftware/stingray

Version submitted: 2.1
EiC: @cmarmo
Editor: @hamogu
Reviewer 1: TBD
Reviewer 2: TBD
Archive: TBD
JOSS DOI: TBD
Version accepted: TBD
Date accepted (month/day/year): TBD


Code of Conduct & Commitment to Maintain Package

Description

  • Include a brief paragraph describing what your package does:

Stingray is a Python library for "spectral timing", i.e. time-series analysis techniques that can be used to study how variability changes or correlates between different energy bands/wavelengths. It is an Astropy-affiliated package with an ever-growing user base, now comprising hundreds of researchers around the globe.

Scope

  • Please indicate which category or categories.
    Check out our package scope page to learn more about our
    scope. (If you are unsure of which category you fit, we suggest you make a pre-submission inquiry):

    • Data retrieval
    • Data extraction
    • Data processing/munging
    • Data deposition
    • Data validation and testing
    • Data visualization¹
    • Workflow automation
    • Citation management and bibliometrics
    • Scientific software wrappers
    • Database interoperability

Domain Specific

  • Geospatial
  • Education

Community Partnerships

If your package is associated with an
existing community, please check below:

  • For all submissions, explain how and why the package falls under the categories you indicated above. In your explanation, please address the following points (briefly, 1-2 sentences for each):

This package is mostly focused on the analysis of new data from high-energy missions. We added data validation and testing because the package can be used as a quick-look tool to point out possible anomalies in observations (e.g. solar or other background flares).

  • Who is the target audience and what are scientific applications of this package?

The target audience is principally researchers and students of X-ray and multi-wavelength astronomy.
This package fills a void for a free and open-source (spectral-)timing package for X-ray astronomy. XRONOS, formerly maintained by HEASARC, is currently unmaintained, and the analysis of high-energy time series is mostly done with proprietary software or mission-specific packages. We provide a Python package that eases the learning curve for newcomers, thanks in part to extensive tutorials based on Jupyter notebooks, and gives experts a powerful, robust library for their analysis. The software analyzes astronomical time series and performs a range of tasks, including periodicity searches, time-lag calculations, covariance spectra, and power spectral modeling.
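
For illustration, a minimal power-spectrum analysis with Stingray's core classes might look like the sketch below (simulated data; `Lightcurve` and `Powerspectrum` are part of the public API, but the numbers are arbitrary):

```python
import numpy as np
from stingray import Lightcurve, Powerspectrum

# Simulate a 100 s light curve sampled every 10 ms with Poisson noise
dt = 0.01
time = np.arange(0, 100, dt)
counts = np.random.poisson(100, size=time.size)

lc = Lightcurve(time, counts, dt=dt)
ps = Powerspectrum(lc, norm="leahy")  # Leahy-normalized power spectrum
print(ps.freq[:5], ps.power[:5])
```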

  • Are there other Python packages that accomplish the same thing? If so, how does yours differ?

Our package is arguably the best-known Python package for X-ray spectral timing.

  • If you made a pre-submission enquiry, please paste the link to the corresponding issue, forum post, or other discussion, or @tag the editor you contacted:

#195

Technical checks

For details about the pyOpenSci packaging requirements, see our packaging guide. Confirm each of the following by checking the box. This package:

  • does not violate the Terms of Service of any service it interacts with.
  • uses an OSI approved license.
  • contains a README with instructions for installing the development version.
  • includes documentation with examples for all functions.
  • contains a tutorial with examples of its essential functions and uses.
  • has a test suite.
  • has continuous integration setup, such as GitHub Actions, CircleCI, and/or others.

Publication Options

JOSS Checks
  • The package has an obvious research application according to JOSS's definition in their submission requirements. Be aware that completing the pyOpenSci review process does not guarantee acceptance to JOSS. Be sure to read their submission requirements (linked above) if you are interested in submitting to JOSS.
  • The package is not a "minor utility" as defined by JOSS's submission requirements: "Minor ‘utility’ packages, including ‘thin’ API clients, are not acceptable." pyOpenSci welcomes these packages under "Data Retrieval", but JOSS has slightly different criteria.
  • The package contains a paper.md matching JOSS's requirements with a high-level description in the package root or in inst/. We have a previous JOSS paper describing the package. Is it acceptable to have a new paper, including the many improvements made in the last five years?
  • The package is deposited in a long-term repository with the DOI:

Note: JOSS accepts our review as theirs. You will NOT need to go through another full review. JOSS will only review your paper.md file. Be sure to link to this pyOpenSci issue when a JOSS issue is opened for your package. Also be sure to tell the JOSS editor that this is a pyOpenSci reviewed package once you reach this step.

Are you OK with Reviewers Submitting Issues and/or pull requests to your Repo Directly?

This option will allow reviewers to open smaller issues that can then be linked to PRs rather than submitting a denser, text-based review. It will also allow you to demonstrate addressing the issue via PR links.

  • Yes I am OK with reviewers submitting requested changes as issues to my repo. Reviewers will then link to the issues in their submitted review.

Confirm each of the following by checking the box.

  • I have read the author guide.
  • I expect to maintain this package for at least 2 years and can help find a replacement for the maintainer (team) if needed.

Please fill out our survey

P.S. Have feedback/comments about our review process? Leave a comment here

Editor and Review Templates

The editor template can be found here.

The review template can be found here.

Footnotes

  1. Please fill out a pre-submission inquiry before submitting a data visualization package.

@cmarmo added the astropy (An astropy community affiliated package review) label on Jul 1, 2024
cmarmo (Member) commented Jul 1, 2024

Editor in Chief checks

Hi there! I'm Chiara, and I am taking care of the Editor in Chief role for the next three months.
Thank you for submitting your package for pyOpenSci review.
Below are the basic checks that your package needs to pass to begin our review.
If some of these are missing, we will ask you to work on them before the review process begins.

Please check our Python packaging guide for more information on the elements
below.

  • Installation The package can be installed from a community repository such as PyPI (preferred), and/or a community channel on conda (e.g. conda-forge, bioconda).
    • The package imports properly into a standard Python environment (`import package`).
  • Fit The package meets criteria for fit and overlap.
  • Documentation The package has sufficient online documentation to allow us to evaluate package function and scope without installing the package. This includes:
    • User-facing documentation that overviews how to install and start using the package.
    • Short tutorials that help a user understand how to use the package and what it can do for them.
    • API documentation (documentation for your code's functions, classes, methods and attributes): this includes clearly written docstrings with variables defined using a standard docstring format.
  • Core GitHub repository Files
    • README The package has a README.md file with clear explanation of what the package does, instructions on how to install it, and a link to development instructions.
    • Contributing File The package has a CONTRIBUTING.md file that details how to install and contribute to the package.
    • Code of Conduct The package has a CODE_OF_CONDUCT.md file.
    • License The package has an OSI approved license.
      NOTE: We prefer that you have development instructions in your documentation too.
  • Issue Submission Documentation All of the information is filled out in the YAML header of the issue (located at the top of the issue template).
  • Automated tests Package has a testing suite and is tested via a Continuous Integration service.
  • Repository The repository link resolves correctly.
  • Package overlap The package doesn't entirely overlap with the functionality of other packages that have already been submitted to pyOpenSci.
  • Archive (JOSS only, may be post-review): The repository DOI resolves correctly.
  • Version (JOSS only, may be post-review): Does the release version given match the GitHub release (v1.0.0)?

  • Initial onboarding survey was filled out
    We appreciate each maintainer of the package filling out this survey individually. 🙌
    Thank you authors in advance for setting aside five to ten minutes to do this. It truly helps our organization. 🙌


Editor comments

Stingray is in excellent form! 💪
Congratulations!
I am going to check the availability of our Astropy editors and we will be back to you shortly.

cmarmo (Member) commented Jul 2, 2024

Hello @matteobachetti, I am pleased to announce that @hamogu will take care of your submission as editor.
Thank you Moritz!
I wish all of you a nice review process! :)

hamogu commented Jul 2, 2024

Editor response to review:


Editor comments

👋 Hi @taldcroft and @masonng-astro! Thank you for volunteering to review
for pyOpenSci!

Please fill out our pre-review survey

Before beginning your review, please fill out our pre-review survey. This helps us improve all aspects of our review and better understand our community. No personal data will be shared from this survey - it will only be used in an aggregated format by our Executive Director to improve our processes and programs.

  • reviewer 1 survey completed.
  • reviewer 2 survey completed.
  • reviewer 3 (if applicable)

The following resources will help you complete your review:

  1. Here is the reviewers guide. This guide contains all of the steps and information needed to complete your review.
  2. Here is the review template that you will need to fill out and submit
    here as a comment, once your review is complete.

Please get in touch with any questions or concerns! Your review is due:

Reviewers: Tom Aldcroft and Mason Ng
Due date: Sept 6th, 2024

hamogu commented Jul 2, 2024

We have a previous JOSS paper describing the package. Is it acceptable to have a new paper, including the many improvements made in the last five years?

It's up to JOSS, not us, how exactly they handle that. To save everyone time, JOSS will generally accept our review as theirs, because there is really no need for a second set of reviewers doing essentially the same checks that our reviewers just did. However, the final decision to accept into JOSS is up to JOSS.

So, I think there are two ways this can be done, at our choice:

  • Either you contact JOSS and ask for their opinion. (In that case, I would pause this review until you hear back from them.)
  • Or you write the paper.md as you would like to have it, and we review the code and paper as we normally do and pass it to JOSS once our review is done. (In that case, I would pause this review until you've written the paper.md so that our reviewers can actually review the paper.)

Option 2 is faster (you don't have to wait for the JOSS response first), but carries the risk that the JOSS editor decides not to publish it later.

Note that JOSS says: "we also review submissions that are significant contributions made to existing packages".
Based on this, I suggest option 2 (you add a `paper.md` here before we review), and I will specifically instruct our reviewers to comment on whether they judge the development in the last 5 years to be a "significant contribution".

matteobachetti (Author) commented Jul 4, 2024

@hamogu we are working on the JOSS paper draft here: StingraySoftware/stingray#829
The text will not change much; we are working mostly on the author list and acknowledgements.
The new paper is paper2.md, and the bibliography is joss2.bib (because we have the 2019 paper in the same directory).

hamogu commented Jul 8, 2024

Just let me know when the paper is ready and I'll let the reviewers know to start their review. In the meantime, I'm emailing potential reviewers to find people who can do the review when you are ready.

matteobachetti (Author) commented:

@hamogu the paper is ready, but I'm still missing a couple of ORCID iDs and affiliations. Does it have to be merged, or is the link to the PR sufficient?

hamogu commented Jul 17, 2024

Just letting you know that I'm in the process of looking for reviewers; both I and some potential reviewers are on summer travel, but I'm hoping to get you two reviewers soon to get this started.

hamogu commented Aug 12, 2024

Just letting you know that I'm still looking for reviewers. Seems that everyone is on vacation right now...

matteobachetti (Author) commented:

Understandable 😅
No worries!

hamogu commented Aug 12, 2024

👋 Hi @taldcroft and @masonng-astro!
Thanks for agreeing to do a review of stingray for us. Generic instructions (pre-review survey, etc.) are above: #201 (comment)

In addition, note that this package is also submitting to JOSS, and JOSS will accept our review rather than doing a separate one. However, stingray already had a JOSS publication a few years ago. So it would be very helpful to note in your review whether you deem the new features added since then a "substantial contribution" that merits a new JOSS publication. You can find the paper draft for JOSS in StingraySoftware/stingray#829

Please reach out to me if you have any questions!

hamogu commented Sep 6, 2024

👋 Hi @taldcroft and @masonng-astro! I'm hoping to get your reviews for this package soon! If you have any problems or need help or guidance in how to do a review, please reach out to me any time!

masonng-astro commented Sep 6, 2024

(Apologies for any erroneous formatting)

Package Review

Please check off boxes as applicable, and elaborate in comments below. Your review is not limited to these topics, as described in the reviewer guide.

  • As the reviewer I confirm that there are no conflicts of interest for me to review this work (If you are unsure whether you are in conflict, please speak to your editor before starting your review).

Documentation

The package includes all the following forms of documentation:

  • A statement of need clearly stating problems the software is designed to solve and its target audience in README.
  • Installation instructions: for the development version of the package and any non-standard dependencies in README.
  • Vignette(s) demonstrating major functionality that runs successfully locally.
  • Function Documentation: for all user-facing functions.
  • Examples for all user-facing functions.
  • Community guidelines including contribution guidelines in the README or CONTRIBUTING.
  • Metadata including author(s), author e-mail(s), a url, and any other relevant metadata e.g., in a pyproject.toml file or elsewhere.
    • It looks like there is a list in “CREDITS.rst”, but it is not part of the metadata. There are no explicit contact details attached, either.

Readme file requirements
The package meets the readme requirements below:

  • Package has a README.md file in the root directory.

The README should include, from top to bottom:

  • The package name
  • Badges for:
    • Continuous integration and test coverage,
    • Docs building (if you have a documentation website),
    • A repostatus.org badge,
      • This is not present, as far as I can tell
    • Python versions supported,
      • This is not present, as far as I can tell
    • Current package version (on PyPI / Conda).
  • Short description of package goals.
  • Package installation instructions
  • Any additional setup required to use the package (authentication tokens, etc.)
  • Descriptive links to all vignettes. If the package is small, there may only be a need for one vignette which could be placed in the README.md file.
    • Brief demonstration of package usage (as it makes sense - links to vignettes could also suffice here if package description is clear)
  • Link to your documentation website.
  • If applicable, how the package compares to other similar packages and/or how it relates to other packages in the scientific ecosystem.
  • Citation information

Usability

Reviewers are encouraged to submit suggestions (or pull requests) that will improve the usability of the package as a whole.
Package structure should follow general community best-practices. In general please consider whether:

  • Package documentation is clear and easy to find and use.
  • The need for the package is clear
  • All functions have documentation and associated examples for use
  • The package is easy to install

Functionality

  • Installation: Installation succeeds as documented.
  • Functionality: Any functional claims of the software have been confirmed.
  • Performance: Any performance claims of the software have been confirmed.
  • Automated tests:
    • All tests pass on the reviewer's local machine for the package version submitted by the author. Ideally this should be a tagged version making it easy for reviewers to install.
      • Almost every single test passed - see the code snippet at the very bottom of this review.
    • Tests cover essential functions of the package and a reasonable range of inputs and conditions.
  • Continuous Integration: Has continuous integration setup (We suggest using Github actions but any CI platform is acceptable for review)
  • Packaging guidelines: The package conforms to the pyOpenSci packaging guidelines.
    A few notable highlights to look at:
    • Package supports modern versions of Python and not End of life versions.
    • Not sure if this is explicitly noted, but as per above, there is no badge for which Python versions are supported
    • Code format is standard throughout package and follows PEP 8 guidelines (CI tests for linting pass)

For packages also submitting to JOSS

Note: Be sure to check this carefully, as JOSS's submission requirements and scope differ from pyOpenSci's in terms of what types of packages are accepted.

The package contains a paper.md matching JOSS's requirements with:

  • A short summary describing the high-level functionality of the software
  • Authors: A list of authors with their affiliations
  • A statement of need clearly stating problems the software is designed to solve and its target audience.
  • References: With DOIs for all those that have one (e.g. papers, datasets, software).

Final approval (post-review)

  • The author has responded to my review and made changes to my satisfaction. I recommend approving this package.

Estimated hours spent reviewing: 6

---

Review Comments

As a frequent user of Stingray, I have casually followed the development of Stingray over the years. It has evolved from its previous iteration and from past projects (Stingray, HENDRICS, etc.) into a full software suite of spectral timing tools applicable to event data. I have always found the documentation to be extensive and comprehensive, with pedagogical Jupyter notebook-style tutorials to get started. My comments are thus fairly minor. I also note the absence of several items from the checklist above, though please let me know if I missed something.

On the JOSS publication, I believe the addition of new features, the integration of additional data formats, and the increased performance of the codebase merit a new JOSS publication. I just have very minor comments below.

  • Consistency between “modelling” and “modeling” (except for references to the modeling subpackage)
  • “analyse” was written in the text instead of “analyze” (I assume the authors are adopting US English spelling throughout)
  • The hyperlinks to “our benchmarks” and “Lightkurve” point to wrong locations
  • I believe the authors will be adding the references before final approval

I have glanced through all of the notebooks, and I encountered an issue in the “Bexvar tutorial” notebook. This might be my set-up, but I get the following error:

```python
ImportError: dlopen(/opt/homebrew/anaconda3/lib/python3.10/site-packages/ultranest/mlfriends.cpython-310-darwin.so, 0x0002): tried: '/Users/masonng/Documents/MIT/Research/software/heasoft-6.33/aarch64-apple-darwin22.1.0/mlfriends.cpython-310-darwin.so' (no such file), '/Users/masonng/Documents/MIT/Research/software/heasoft-6.33/aarch64-apple-darwin22.1.0/lib/mlfriends.cpython-310-darwin.so' (no such file), '/opt/homebrew/anaconda3/lib/python3.10/site-packages/ultranest/mlfriends.cpython-310-darwin.so' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64')), '/System/Volumes/Preboot/Cryptexes/OS/opt/homebrew/anaconda3/lib/python3.10/site-packages/ultranest/mlfriends.cpython-310-darwin.so' (no such file), '/opt/homebrew/anaconda3/lib/python3.10/site-packages/ultranest/mlfriends.cpython-310-darwin.so' (mach-o file, but is an incompatible architecture (have 'x86_64', need 'arm64'))
```

I wonder whether this is because of an incompatibility with the arm64 architecture. I did install UltraNest with “arch -arm64”, but the file “mlfriends.cpython-310-darwin.so” is still built for x86_64. Apparently UltraNest does not explicitly support the arm64 architecture (PyPI project here)?
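
(For reference, a quick way to compare the interpreter's architecture with the one the extension was built for; this is just a diagnostic sketch, assuming the macOS `file` utility is available:)

```python
import importlib.util
import platform
import subprocess
from pathlib import Path

# Architecture of the running interpreter: 'arm64' or 'x86_64'
print("interpreter:", platform.machine())

# Locate the installed ultranest package without importing it
# (find_spec on a top-level package does not execute it), then ask
# `file` which architecture the compiled extension was built for.
spec = importlib.util.find_spec("ultranest")
if spec is not None and spec.submodule_search_locations:
    pkg_dir = Path(list(spec.submodule_search_locations)[0])
    for so in pkg_dir.glob("mlfriends*.so"):
        result = subprocess.run(["file", str(so)], capture_output=True, text=True)
        print(result.stdout)
```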

I appreciate the “dependencies” sub-section under “installation instructions.” I would just suggest adding “UltraNest” to the list. Also, have tests been run on Apple Silicon devices? I encountered the above UltraNest error on my MacBook Pro M1, for example, though this could be a problem specific to my setup.

```
>>> stingray.test()
=========================================================================================== test session starts ============================================================================================
platform darwin -- Python 3.10.12, pytest-7.1.2, pluggy-1.5.0

Running tests with stingray version 2.1.
Running tests in /opt/homebrew/anaconda3/lib/python3.10/site-packages/stingray.

Date: 2024-09-05T15:05:00

Platform: macOS-13.0.1-arm64-arm-64bit

Executable: /opt/homebrew/anaconda3/bin/python

Full Python Version:
3.10.12 | packaged by conda-forge | (main, Jun 23 2023, 22:41:52) [Clang 15.0.7 ]

encodings: sys: utf-8, locale: UTF-8, filesystem: utf-8
byteorder: little
float info: dig: 15, mant_dig: 15

Package versions:
Numpy: 1.23.5
Scipy: 1.10.0
Matplotlib: 3.7.0
h5py: 3.7.0
scikit-image: 0.19.3

Using Astropy options: remote_data: none.

rootdir: /opt/homebrew/anaconda3/lib/python3.10/site-packages/stingray
plugins: anyio-3.5.0, remotedata-0.4.1, doctestplus-1.2.1, astropy-header-0.2.2, asdf-3.4.0
collected 1650 items / 1 skipped

../deadtime/tests/test_fad.py ................                                                                                                                                                       [  0%]
../deadtime/tests/test_models.py ...............                                                                                                                                                     [  1%]
../mission_support/tests/test_mission_support.py ..                                                                                                                                                  [  2%]
../modeling/tests/test_parameterestimation.py ............s.........s...............................                                                                                                 [  5%]
../modeling/tests/test_posterior.py .....................................................                                                                                                            [  8%]
../modeling/tests/test_scripts.py .......                                                                                                                                                            [  8%]
../pulse/overlapandsave/test_ols.py ss...                                                                                                                                                            [  9%]
../pulse/tests/test_accelsearch.py ......                                                                                                                                                            [  9%]
../pulse/tests/test_modeling.py ........                                                                                                                                                             [ 10%]
../pulse/tests/test_pulse.py .sss...............................                                                                                                                                     [ 12%]
../pulse/tests/test_search.py ..............................                                                                                                                                         [ 14%]
../simulator/tests/test_base.py ..                                                                                                                                                                   [ 14%]
../simulator/tests/test_simulator.py ........................................................                                                                                                        [ 17%]
../simulator/tests/test_transfer.py ................                                                                                                                                                 [ 18%]
test_base.py .............................................................................................................................                                                           [ 26%]
test_bexvar.py sssss..s.s.                                                                                                                                                                           [ 26%]
test_bispectrum.py ............................                                                                                                                                                      [ 28%]
test_covariancespectrum.py ......................                                                                                                                                                    [ 29%]
test_crosscorrelation.py ...........................                                                                                                                                                 [ 31%]
test_crossspectrum.py .............................................................................................................................................................................. [ 41%]
...........................................                                                                                                                                                          [ 44%]
test_events.py ..............................................................                                                                                                                        [ 48%]
test_filters.py ..........                                                                                                                                                                           [ 48%]
test_fourier.py .................................................................................................................................................................................... [ 59%]
........                                                                                                                                                                                             [ 60%]
test_gti.py .........................................................                                                                                                                                [ 63%]
test_io.py ....F..............                                                                                                                                                                       [ 64%]
test_lightcurve.py .....s..............................................................................................ss........................................ss.......                           [ 74%]
test_lombscargle.py ...............................................                                                                                                                                  [ 76%]
test_multitaper.py ......................................                                                                                                                                            [ 79%]
test_power_colors.py .............                                                                                                                                                                   [ 80%]
test_powerspectrum.py .................................................................................................................................................................              [ 89%]
test_sampledata.py ....                                                                                                                                                                              [ 90%]
test_spectroscopy.py xX......                                                                                                                                                                        [ 90%]
test_stats.py .......................................................                                                                                                                                [ 93%]
test_utils.py ...................................................s..........                                                                                                                         [ 97%]
test_varenergyspectrum.py ........................................                                                                                                                                   [100%]

================================================================================================= FAILURES =================================================================================================
______________________________________________________________________________ TestIO.test_event_file_read_and_automatic_sort ______________________________________________________________________________

self = <stingray.tests.test_io.TestIO object at 0x167545360>

    def test_event_file_read_and_automatic_sort(self):
        """Test event file reading."""
        fname = os.path.join(datadir, "monol_testA_calib.evt")
        with pytest.warns(AstropyUserWarning, match="No valid GTI extensions"):
            evdata = load_events_and_gtis(fname)
        fname_unsrt = os.path.join(datadir, "monol_testA_calib_unsrt.evt")
>       with pytest.warns(UserWarning, match="not sorted. Sorting them for you"):
E       Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted. The list of emitted warnings is: [].

test_io.py:77: Failed
============================================================================================= warnings summary =============================================================================================
../../_pytest/config/__init__.py:1129
  /opt/homebrew/anaconda3/lib/python3.10/site-packages/_pytest/config/__init__.py:1129: PytestAssertRewriteWarning: Module already imported so cannot be rewritten: asdf
    self._mark_plugins_for_rewrite(hook)

../deadtime/tests/test_fad.py:22: 1 warning
../deadtime/tests/test_models.py:14: 1 warning
../modeling/tests/test_parameterestimation.py:34: 1 warning
../pulse/overlapandsave/test_ols.py:88: 1 warning
../pulse/tests/test_accelsearch.py:6: 1 warning
../pulse/tests/test_pulse.py:32: 1 warning
../pulse/tests/test_pulse.py:55: 1 warning
../pulse/tests/test_search.py:14: 1 warning
test_bexvar.py:11: 1 warning
test_crossspectrum.py:331: 1 warning
test_crossspectrum.py:909: 1 warning
test_crossspectrum.py:921: 1 warning
test_crossspectrum.py:938: 1 warning
test_crossspectrum.py:958: 1 warning
test_lightcurve.py:1605: 1 warning
test_multitaper.py:13: 1 warning
test_powerspectrum.py:210: 1 warning
test_spectroscopy.py:93: 1 warning
test_varenergyspectrum.py:171: 1 warning
test_varenergyspectrum.py:358: 1 warning
  PytestUnknownMarkWarning: Unknown pytest.mark.slow - is this a typo?  You can register custom marks to avoid this warning - for details, see <https://docs.pytest.org/en/stable/how-to/mark.html>

../../pint/pulsar_mjd.py:57
  RuntimeWarning: This platform does not support extended precision floating-point, and PINT will run at reduced precision.

deadtime/tests/test_fad.py: 2 warnings
tests/test_base.py: 3 warnings
tests/test_crossspectrum.py: 1 warning
tests/test_events.py: 2 warnings
tests/test_lightcurve.py: 1 warning
tests/test_powerspectrum.py: 1 warning
tests/test_varenergyspectrum.py: 1 warning
  UserWarning: table path was not set via the path= argument; using default path __astropy_table__

deadtime/tests/test_models.py::test_checkA[True]
deadtime/tests/test_models.py::test_checkA[False]
  UserWarning: All values for SymLogScale are below linthresh, making it effectively linear. You likely should lower the value of linthresh.

modeling/tests/test_parameterestimation.py::TestParameterEstimation::test_sampler_runs
  DeprecationWarning: The py23 module has been deprecated and will be removed in a future release. Please update your code.

modeling/tests/test_parameterestimation.py::TestPSDParEst::test_fit_method_works_with_correct_parameter
  RuntimeWarning: More than 20 figures have been opened. Figures created through the pyplot interface (`matplotlib.pyplot.figure`) are retained until explicitly closed and may consume too much memory. (To control this warning, see the rcParam `figure.max_open_warning`). Consider using `matplotlib.pyplot.close()`.

modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_crossspectrum
modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_powerspectrum
pulse/tests/test_accelsearch.py::TestAccelsearch::test_signal
pulse/tests/test_accelsearch.py::TestAccelsearch::test_signal_neg_fdot
pulse/tests/test_accelsearch.py::TestAccelsearch::test_noisy
pulse/tests/test_accelsearch.py::TestAccelsearch::test_noisy_neg_fdot
  RuntimeWarning: divide by zero encountered in log

modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_crossspectrum
modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_powerspectrum
  RuntimeWarning: divide by zero encountered in divide

modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_crossspectrum
modeling/tests/test_scripts.py::TestFitLorentzians::test_fit_powerspectrum
  RuntimeWarning: invalid value encountered in subtract

pulse/tests/test_modeling.py::test_gaussian_bounds
  RuntimeWarning: underflow encountered in exp

pulse/tests/test_modeling.py::test_gaussian_bounds
  RuntimeWarning: underflow encountered in multiply

pulse/tests/test_search.py: 1 warning
simulator/tests/test_simulator.py: 39 warnings
tests/test_crosscorrelation.py: 1 warning
tests/test_crossspectrum.py: 2 warnings
tests/test_lightcurve.py: 1 warning
tests/test_lombscargle.py: 3 warnings
tests/test_multitaper.py: 1 warning
tests/test_varenergyspectrum.py: 3 warnings
  UserWarning: SIMON says: Stingray only uses poisson err_dist at the moment. All analysis in the light curve will assume Poisson errors. Sorry for the inconvenience.

simulator/tests/test_simulator.py: 2 warnings
tests/test_crosscorrelation.py: 1 warning
tests/test_crossspectrum.py: 75 warnings
tests/test_spectroscopy.py: 32 warnings
tests/test_varenergyspectrum.py: 1 warning
  UserWarning: n_ave is below 30. Please note that the error bars on the quantities derived from the cross spectrum are only reliable for a large number of averaged powers.

simulator/tests/test_simulator.py: 2 warnings
tests/test_crosscorrelation.py: 1 warning
tests/test_crossspectrum.py: 97 warnings
tests/test_lombscargle.py: 10 warnings
tests/test_spectroscopy.py: 46 warnings
tests/test_varenergyspectrum.py: 2 warnings
  RuntimeWarning: invalid value encountered in sqrt

tests/test_bexvar.py::TestInternalFunctions::test_lscg_gen
  UnitsWarning: 'ratio' did not parse as fits unit: At col 0, Unit 'ratio' not supported by the FITS standard.  If this is meant to be a custom unit, define it with 'u.def_unit'. To have it recognized inside a file reader or other code, enable it with 'u.add_enabled_units'. For details, see <https://docs.astropy.org/en/latest/units/combining_and_defining.html>

tests/test_crosscorrelation.py::TestCrossCorrelation::test_simple_plot
tests/test_crosscorrelation.py::TestCrossCorrelation::test_plot_axis
tests/test_crosscorrelation.py::TestCrossCorrelation::test_plot_title
  UserWarning: Matplotlib is currently using agg, which is a non-GUI backend, so cannot show the figure.

tests/test_crossspectrum.py: 12 warnings
  UserWarning: Some error bars in the Averaged Crossspectrum are invalid.Defaulting to sqrt(2 / M) in Leahy norm, rescaled to the appropriate norm.

tests/test_crossspectrum.py::TestAveragedCrossspectrumEvents::test_internal_from_events_works_acs
tests/test_crossspectrum.py::TestAveragedCrossspectrumEvents::test_from_events_works_acs
tests/test_crossspectrum.py::TestAveragedCrossspectrumEvents::test_rebin
tests/test_crossspectrum.py::TestAveragedCrossspectrumEvents::test_rebin_factor
tests/test_crossspectrum.py::TestAveragedCrossspectrumEvents::test_rebin_log
tests/test_crossspectrum.py::TestAveragedCrossspectrum::test_rebin
tests/test_crossspectrum.py::TestAveragedCrossspectrum::test_rebin_factor
tests/test_crossspectrum.py::TestAveragedCrossspectrum::test_rebin_log
  UserWarning: SIMON says: Number of segments used in averaging is significantly low. The result might not follow the expected statistical distributions.

tests/test_crossspectrum.py: 9 warnings
tests/test_powerspectrum.py: 4 warnings
  RuntimeWarning: invalid value encountered in divide

tests/test_multitaper.py::TestMultitaper::test_make_multitaper_from_lightcurve[lc_gauss]
tests/test_multitaper.py::TestMultitaper::test_multitaper_lombscargle
  UserWarning: SIMON says: Looks like your lightcurve statistic is not poisson.The errors in the Powerspectrum will be incorrect.

tests/test_powerspectrum.py::TestPowerspectrum::test_classical_significances_trial_correction
tests/test_powerspectrum.py::TestPowerspectrum::test_pvals_is_numpy_array
  RuntimeWarning: underflow encountered in _logp_multitrial_from_single_logp

-- Docs: <https://docs.pytest.org/en/stable/how-to/capture-warnings.html>
========================================================================================= short test summary info ==========================================================================================
FAILED test_io.py::TestIO::test_event_file_read_and_automatic_sort - Failed: DID NOT WARN. No warnings of type (<class 'UserWarning'>,) were emitted. The list of emitted warnings is: [].
======================================================== 1 failed, 1627 passed, 21 skipped, 1 xfailed, 1 xpassed, 410 warnings in 143.24s (0:02:23) ========================================================
<ExitCode.TESTS_FAILED: 1>

```

taldcroft commented:

@hamogu - sorry, this fell off my radar. Will do early next week.

matteobachetti (Author) commented:

Thanks for the ongoing reviews!
FYI, I've started addressing comments on readme/docs at StingraySoftware/stingray#842

taldcroft commented Sep 9, 2024

Package Review

Please check off boxes as applicable, and elaborate in comments below. Your review is not limited to these topics, as described in the reviewer guide

  • As the reviewer I confirm that there are no conflicts of interest for me to review this work (If you are unsure whether you are in conflict, please speak to your editor before starting your review).

Documentation

The package includes all the following forms of documentation:

  • A statement of need clearly stating problems the software is designed to solve and its target audience in README.
  • Installation instructions: for the development version of the package and any non-standard dependencies in README.
  • Vignette(s) demonstrating major functionality that runs successfully locally.
  • Function Documentation: for all user-facing functions.
  • Examples for all user-facing functions.
  • Community guidelines including contribution guidelines in the README or CONTRIBUTING.
  • Metadata including author(s), author e-mail(s), a url, and any other relevant metadata e.g., in a pyproject.toml file or elsewhere.
    • This information is in setup.cfg

Readme file requirements
The package meets the readme requirements below:

  • Package has a README.md file in the root directory.

The README should include, from top to bottom:

  • The package name
  • Badges for:
    • Continuous integration and test coverage,
    • Docs building (if you have a documentation website),
    • A repostatus.org badge,
      • Not included. I somewhat question the utility of this badge since it seems to rely
        on the package maintainers to keep it up to date. An abandoned project is not
        likely to be marked as such.
    • Python versions supported,
      • Looks like Py 3.8 to 3.12 are tested, but I do not see this in the install docs.
    • Current package version (on PyPI / Conda).

NOTE: If the README has many more badges, you might want to consider using a table for badges: see this example. Such a table should be wider than it is high. (Note that a badge for pyOpenSci peer-review will be provided upon acceptance.)

  • Short description of package goals.
  • Package installation instructions
  • Any additional setup required to use the package (authentication tokens, etc.)
  • Descriptive links to all vignettes. If the package is small, there may only be a need for one vignette which could be placed in the README.md file.
    • Brief demonstration of package usage (as it makes sense - links to vignettes could also suffice here if package description is clear)
  • Link to your documentation website.
  • If applicable, how the package compares to other similar packages and/or how it relates to other packages in the scientific ecosystem.
  • Citation information

Usability

Reviewers are encouraged to submit suggestions (or pull requests) that will improve the usability of the package as a whole.
Package structure should follow general community best-practices. In general please consider whether:

  • Package documentation is clear and easy to find and use.
  • The need for the package is clear
  • All functions have documentation and associated examples for use
  • The package is easy to install

Functionality

  • Installation: Installation succeeds as documented.
  • Functionality: Any functional claims of the software have been confirmed.
    • I do not have the domain knowledge or resources to confirm the functional claims of
      the software with regards to X-ray spectral timing. The extensive collection of
      notebooks should allow the motivated scientist to evaluate and understand the
      results from this package.
  • Performance: Any performance claims of the software have been confirmed.
    • Documentation makes no specific performance claims.
  • Automated tests:
    • All tests pass on the reviewer's local machine for the package version submitted by the author. Ideally this should be a tagged version making it easy for reviewers to install.
    • Tests cover essential functions of the package and a reasonable range of inputs and conditions.
  • Continuous Integration: Has continuous integration setup (We suggest using Github actions but any CI platform is acceptable for review)
  • Packaging guidelines: The package conforms to the pyOpenSci packaging guidelines.
    A few notable highlights to look at:
    • Package supports modern versions of Python and not End of life versions.
    • Code format is standard throughout package and follows PEP 8 guidelines (CI tests for linting pass)

For packages also submitting to JOSS

Note: Be sure to check this carefully, as JOSS's submission requirements and scope differ from pyOpenSci's in terms of what types of packages are accepted.

The package contains a paper.md matching JOSS's requirements with:

  • A short summary describing the high-level functionality of the software
  • Authors: A list of authors with their affiliations
  • A statement of need clearly stating problems the software is designed to solve and its target audience.
  • References: With DOIs for all those that have one (e.g. papers, datasets, software).

Final approval (post-review)

  • The author has responded to my review and made changes to my satisfaction. I recommend approving this package.

Estimated hours spent reviewing: 4


Review Comments

I am reviewing this package as an (inactive) astronomer and active Astropy developer.

From a scientific perspective, it appears that the stingray package fills a distinct
need for modern, open-source, Python-based X-ray spectral timing tools. It is widely
used in the X-ray timing community, with 170 citations to the 2019 ApJ paper; in recent
years it has been gathering around 35 citations per year.

From a software perspective this checks all the boxes (literally) for a package that
meets current standards for a well-developed, documented, and maintained package. The
test coverage is excellent.

I ran all the notebooks and found a couple of problems (documented as issues). This
highlights the need for StingraySoftware/stingray#802 to be worked on, but overall
things are looking OK, and those notebook issues are not a blocker for accepting the
package.

I opened one issue to document how to download the tutorial notebooks that form most of
the package's functional documentation. This is a simple docs update, and I would like
to see it done before the package is accepted.

matteobachetti (Author) commented:

Update:

matteobachetti (Author) commented:

@masonng-astro I have a similar setup to yours (MacBook Pro with an M1 processor), but I couldn't reproduce your bugs 🆘

The failing test on unsorted data is particularly weird, as it does not seem to rely on any specific dependency beyond the default ones. There is a slim chance that it is related to the numba installation, as I use numba to check for sortedness, but the code should fall back to standard numpy if numba is not installed or fails.
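
(For context, the check is conceptually along the lines of the sketch below; this is an illustration, not Stingray's actual implementation:)

```python
import numpy as np

try:
    from numba import njit

    @njit
    def is_sorted(a):
        # numba-compiled early-exit scan over the array
        for i in range(a.size - 1):
            if a[i + 1] < a[i]:
                return False
        return True

except ImportError:
    def is_sorted(a):
        # pure-numpy fallback: all successive differences are non-negative
        return bool(np.all(np.diff(a) >= 0))
```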

I could also install UltraNest without issues. I wonder if you have conflicting versions of some dependencies? From the list of libraries in your error, there seems to be some kind of clash between HEASOFT-installed libraries and the standard Python path. Is that possible?

taldcroft commented:

FWIW, as noted in my review, tests passed for me on Apple Silicon and Python 3.12. I created a fresh conda environment (using the conda defaults channel) for Python 3.12 and then pip-installed everything else.
