Joss-reviews: [REVIEW]: phasespace: $n$-body phase space generation in Python

Created on 15 Jul 2019  ·  48 comments  ·  Source: openjournals/joss-reviews

Submitting author: @apuignav (Albert Puig Navarro)
Repository: https://github.com/zfit/phasespace/
Version: 1.0.2
Editor: @poulson
Reviewers: @mdoucet, @stuartcampbell, @vyasr
Archive: 10.5281/zenodo.2591993

Status

[status badge]

Status badge code:

HTML: <a href="http://joss.theoj.org/papers/8d627191394fa68c607459480e0b4b6f"><img src="http://joss.theoj.org/papers/8d627191394fa68c607459480e0b4b6f/status.svg"></a>
Markdown: [![status](http://joss.theoj.org/papers/8d627191394fa68c607459480e0b4b6f/status.svg)](http://joss.theoj.org/papers/8d627191394fa68c607459480e0b4b6f)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@mdoucet & @stuartcampbell & @vyasr, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @poulson know.

✨ Please try and complete your review in the next two weeks ✨

Review checklist for @mdoucet

Conflict of interest

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Version: Does the release version given match the GitHub release (1.0.2)?
  • [x] Authorship: Has the submitting author (@apuignav) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Authors: Does the paper.md file include a list of authors with their affiliations?
  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

Review checklist for @stuartcampbell

Conflict of interest

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Version: Does the release version given match the GitHub release (1.0.2)?
  • [x] Authorship: Has the submitting author (@apuignav) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Authors: Does the paper.md file include a list of authors with their affiliations?
  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

Review checklist for @vyasr

Conflict of interest

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Version: Does the release version given match the GitHub release (1.0.2)?
  • [x] Authorship: Has the submitting author (@apuignav) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Authors: Does the paper.md file include a list of authors with their affiliations?
  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?
Labels: accepted, published, recommend-accept, review


All 48 comments

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @mdoucet, @stuartcampbell, @vyasr it looks like you're currently assigned to review this paper :tada:.

:star: Important :star:

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that due to GitHub's default behaviour you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/joss-reviews:

[screenshot: 'Not watching' setting]

  2. You may also like to change your default settings for watching repositories in your GitHub profile: https://github.com/settings/notifications

[screenshot: notification settings]

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

Attempting PDF compilation. Reticulating splines etc...

@whedon generate pdf

Attempting PDF compilation. Reticulating splines etc...

Fixed a couple of typography issues, now it's good to go!

Additional notes relating to my review:

  • I don't think most of the elements added to setup_requires are actually necessary for users (e.g. sphinx, bumpversion, wheel, twine, etc). I would remove those that the user shouldn't actually need. Developers can install those as needed.
  • A few tests currently fail because they're referring to nonexistent files. Based on your documentation, these tests (in test_physics.py) are the ones that are meant to test against external codes. If the outputs are being stored as artifacts directly on your testing servers or something similar, I would suggest using e.g. pytest.mark.skip if not on CI to avoid users seeing a failure.
  • Although they are internal methods, they're doing the bulk of the work it seems like, so I would like to see some documentation of the _generate and _recursive_generate functions for the sake of future developers.

Once these requests are addressed, I can review the actual paper as well.
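The CI-skipping suggestion above could be sketched with a pytest marker along the following lines (a hypothetical sketch, not the project's actual test code; the `CI` environment variable is set by most CI services, but the exact name depends on the provider):

```python
# Hypothetical sketch: skip tests that rely on reference files
# (produced by external generators) when not running on CI.
# Assumes pytest is installed.
import os

import pytest

# Most CI services (Travis, GitHub Actions, ...) export CI=true.
ON_CI = os.environ.get("CI", "false").lower() == "true"

@pytest.mark.skipif(not ON_CI,
                    reason="reference files are only available on the CI server")
def test_against_external_generator():
    # ... compare generated distributions against stored reference output ...
    pass
```

This way a user running the test suite locally sees a skip with an explanatory reason instead of a spurious failure.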

I'm done with my review. The code does what it says it does. I could run the code, generate decays and check that the 4-momentum vectors and invariant mass of the results made sense. I only wish that functionality to deal with decay widths, along the lines of the helper code in tests/helpers/decays.py, would be part of the main code.

  • I don't think most of the elements added to setup_requires are actually necessary for users (e.g. sphinx, bumpversion, wheel, twine, etc). I would remove those that the user shouldn't actually need. Developers can install those as needed.

I agree, removed

  • A few tests currently fail because they're referring to nonexistent files. Based on your documentation, these tests (in test_physics.py) are the ones that are meant to test against external codes. If the outputs are being stored as artifacts directly on your testing servers or something similar, I would suggest using e.g. pytest.mark.skip if not on CI to avoid users seeing a failure.
  • Although they are internal methods, they're doing the bulk of the work it seems like, so I would like to see some documentation of the _generate and _recursive_generate functions for the sake of future developers.

Fixed the CI (there was a problem with uploading the large files to the CI)

I'm done with my review.

Thanks a lot for that!

I only wish that functionality to deal with decay widths, along the lines of the helper code in tests/helpers/decays.py, would be part of the main code.

This is intentionally not part of the package. phasespace aims to provide only the decay mechanics and leaves other functionality to other libraries, since there are enough suitable ones around. For example, using tensorflow_probability with its distributions, together with the particle package for particle characteristics, already works well. The docs also provide an example that can essentially be copy-pasted; the mechanism is pretty simple.

So from our point of view, the additional maintenance and design effort would not be worth it; instead, the focus is on providing a stable interface and the flexibility to compose any function.
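The composition idea described above can be sketched without any phasespace-specific code: sample the mass of an intermediate resonance from a Breit-Wigner line shape externally, then feed each sampled mass to the phase space generator. The resonance parameters below are illustrative, not taken from the package:

```python
# Hypothetical sketch of composing an external mass distribution
# with a phase space generator.
import numpy as np

rng = np.random.default_rng(42)

# Illustrative resonance parameters in GeV (rho(770)-like values).
M0, GAMMA = 0.775, 0.149

def sample_breit_wigner(n, m0, gamma, rng):
    """Draw n masses from a non-relativistic Breit-Wigner, i.e. a
    Cauchy distribution with location m0 and half-width gamma / 2."""
    return m0 + 0.5 * gamma * rng.standard_cauchy(n)

masses = sample_breit_wigner(1000, M0, GAMMA, rng)
# Each entry of `masses` would then be used as the event-by-event
# mass of the intermediate particle when generating the decay.
```

The same pattern works with tensorflow_probability distributions in place of NumPy; the generator only needs the sampled mass values.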

Hi @vyasr: How do you feel your requests are progressing?

I am done with my review - sorry it took so long

@apuignav @mayou36 any thoughts regarding my request for documentation of the generation functions? That's my last outstanding request.

@vyasr I think it makes sense, although the documentation is mainly included in the "parent" methods. We'll add docs to clarify better so it's more developer-friendly.

@vyasr Docs have been updated!

Looks good! I'm all set with my review now.

One minor tip, the docstrings are pretty inconsistent with using tensor, Tensor, and tf.Tensor. If you consistently refer to tensorflow.Tensor using the fully qualified name (it might be possible to use aliases, I haven't tried) you can take advantage of Intersphinx to get links to show up in your documentation.
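The Intersphinx tip could be sketched in Sphinx's `conf.py` as follows (a hypothetical configuration; TensorFlow does not ship an official `objects.inv`, so the inventory entry below is a placeholder that would need to point at a real inventory):

```python
# conf.py -- hypothetical sketch of an Intersphinx setup.
extensions = [
    "sphinx.ext.autodoc",
    "sphinx.ext.intersphinx",
]

# Resolves fully qualified names in docstrings (e.g. tensorflow.Tensor)
# to links into external documentation via each project's objects.inv.
intersphinx_mapping = {
    "python": ("https://docs.python.org/3", None),
    # Placeholder: a usable TensorFlow inventory must be supplied here.
    "tensorflow": ("https://www.tensorflow.org/api_docs/python", None),
}
```

With this in place, consistently writing `tensorflow.Tensor` (or a documented alias) in docstrings lets Sphinx turn those references into hyperlinks.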

@whedon generate pdf

Attempting PDF compilation. Reticulating splines etc...

The preprint looks good!

@whedon check references

Attempting to check references...

```
Reference check summary:

OK DOIs

- 10.1016/S0168-9002(97)00048-X is OK
- 10.1016/j.cpc.2017.01.029 is OK
- 10.5281/zenodo.2602043 is OK
- 10.5281/zenodo.2591993 is OK

MISSING DOIs

- None

INVALID DOIs

- None
```

@whedon set 10.5281/zenodo.2591993 as archive

OK. 10.5281/zenodo.2591993 is the archive.

:wave: @openjournals/joss-eics, this submission looks ready to be accepted.

Before continuing, @apuignav, please edit the metadata of the Zenodo archive so the author list and title matches the JOSS paper.

Should be done. I don't know why Zenodo deletes all the info whenever a new release happens :-(

Hey! Is there anything missing from our side? I have not been following closely lately.

👋 @apuignav - sorry for the delay - I think you hit a shift change of Associate Editors-in-Chief.

Can you merge some minor changes in the paper: https://github.com/zfit/phasespace/pull/32 ?

Done! Thanks for the fixes!

@whedon accept

Attempting dry run of processing paper acceptance...

```
Reference check summary:

OK DOIs

- 10.1016/S0168-9002(97)00048-X is OK
- 10.1016/j.cpc.2017.01.029 is OK
- 10.5281/zenodo.2602043 is OK
- 10.5281/zenodo.2591993 is OK

MISSING DOIs

- None

INVALID DOIs

- None
```

Check final proof :point_right: https://github.com/openjournals/joss-papers/pull/1059

If the paper PDF and Crossref deposit XML look good in https://github.com/openjournals/joss-papers/pull/1059, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.
@whedon accept deposit=true

👋 @arfon - note that we have a problem with the title in the XML, due to the Greek character, I assume. Your thoughts?

Hrm, looks like Whedon is stripping this out. Feel free to accept here and I'll update the Crossref metadata manually after you accept.

@whedon accept deposit=true

Doing it live! Attempting automated processing of paper acceptance...

๐Ÿฆ๐Ÿฆ๐Ÿฆ ๐Ÿ‘‰ Tweet for this paper ๐Ÿ‘ˆ ๐Ÿฆ๐Ÿฆ๐Ÿฆ

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited :point_right: https://github.com/openjournals/joss-papers/pull/1060
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.01570
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

    Any issues? Notify your editorial technical team...

@arfon - please do your crossref-latex magic :) And then please close this issue when everything seems to be right

Thanks to @mdoucet, @stuartcampbell, @vyasr for reviewing!
And to @poulson for editing!

@arfon - please do your crossref-latex magic :) And then please close this issue when everything seems to be right

OK, fixed up the metadata and opened https://github.com/openjournals/whedon/issues/59 to track this.

:tada::tada::tada: Congratulations on your paper acceptance! :tada::tada::tada:

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.01570/status.svg)](https://doi.org/10.21105/joss.01570)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.01570">
  <img src="https://joss.theoj.org/papers/10.21105/joss.01570/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.01570/status.svg
   :target: https://doi.org/10.21105/joss.01570

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us please consider doing either one (or both) of the following:

Thank you very much to everybody for the review!
