Joss-reviews: [REVIEW]: astrosource: automating optical astronomy measurement, calibration and analysis for variable stellar sources from provided photometry

Created on 5 Sep 2020 · 19 comments · Source: openjournals/joss-reviews

Submitting author: @mfitzasp (Michael Fitzgerald)
Repository: https://github.com/zemogle/astrosource/
Version: v1.4.0
Editor: @arfon
Reviewer: @bsipocz, @joshspeagle
Archive: Pending

:warning: JOSS reduced service mode :warning:

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status

[status badge]

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/9332d9711581954baddd586202bbc92b"><img src="https://joss.theoj.org/papers/9332d9711581954baddd586202bbc92b/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/9332d9711581954baddd586202bbc92b/status.svg)](https://joss.theoj.org/papers/9332d9711581954baddd586202bbc92b)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@bsipocz & @joshspeagle, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @arfon know.

✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨

Review checklist for @bsipocz

Conflict of interest

  • [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [ ] Contribution and authorship: Has the submitting author (@mfitzasp) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • [ ] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [ ] Functionality: Have the functional claims of the software been confirmed?
  • [ ] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [ ] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [ ] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [ ] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • [ ] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [ ] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • [ ] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [ ] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • [ ] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [ ] State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • [ ] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • [ ] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @joshspeagle

Conflict of interest

  • [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Contribution and authorship: Has the submitting author (@mfitzasp) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Labels: Python, TeX, review


All 19 comments

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @bsipocz, @joshspeagle it looks like you're currently assigned to review this paper :tada:.

:warning: JOSS reduced service mode :warning:

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

:star: Important :star:

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which, under GitHub's default behaviour, means you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' https://github.com/openjournals/joss-reviews:

[screenshot: repository watch settings]

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

[screenshot: notification settings]

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1080/21672857.2017.1303264 is OK
- 10.1017/pasa.2014.30 is OK
- 10.1002/asna.201512254 is OK
- 10.3847/1538-3881/153/2/77 is OK
- 10.1088/1538-3873/ab7ee7 is OK
- 10.1017/pasa.2018.5 is OK
- 10.1088/0067-0049/219/1/12 is OK
- 10.32374/atom.2020.1.2 is OK
- 10.32374/atom.2020.1.4 is OK
- 10.1117/12.2314340 is OK
- 10.1086/673168 is OK
- 10.1051/0004-6361/201322068 is OK
- 10.3847/1538-3881/aabc4f is OK
- 10.1109/mcse.2011.37 is OK
- 10.1109/mcse.2007.55 is OK
- 10.3847/1538-3881/aafc33 is OK
- 10.21105/joss.00058 is OK
- 10.32374/atom.2020.1.1 is OK
- 10.1051/0004-6361:20020802 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@bsipocz, @joshspeagle

This is the review thread for the paper. All of our communications will happen here from now on.

Please read the "Reviewer instructions & questions" in the first comment above.

Both reviewers have checklists at the top of this thread (in that first comment) with the JOSS requirements. As you go over the submission, please check any items that you feel have been satisfied. There are also links to the JOSS reviewer guidelines.

The JOSS review is different from most other journals. Our goal is to work with the authors to help them meet our criteria instead of merely passing judgment on the submission. As such, the reviewers are encouraged to submit issues and pull requests on the software repository. When doing so, please mention https://github.com/openjournals/joss-reviews/issues/2641 so that a link is created to this thread (and I can keep an eye on what is happening). Please also feel free to comment and ask questions on this thread. In my experience, it is better to post comments/questions/suggestions as you come across them instead of waiting until you've reviewed the entire package.

We aim for reviews to be completed within about 2-4 weeks. Please let me know if any of you require more time. We can also use Whedon (our bot) to set automatic reminders if you know you'll be away for a period of time.

@arfon - is this the place to ask paper-related questions (rather than software-related ones)?

E.g. should authorship-related questions, such as the explicit opt-in from authors, be discussed here or in an issue in the repo? (I would think here is better, but there may be a guideline saying otherwise that I missed.) Also, I didn't see any discussion in the guidelines about authors for whom there is no trace of contribution in the software repo. Should the commit history be used to determine whether the lead author has made a significant contribution to the software?

These discussions can happen here or in issues in the repo (tagged with this issue to create a link between them).

... active project direction and other forms of non-code contributions are. The authors themselves assume responsibility for deciding who should be credited with co-authorship, and co-authors must always agree to be listed...

@mfitzasp - Since you and @zemogle have interacted with the JOSS submission, I take that as your agreement to be authors. I don't see any interactions in the repo from the other two authors, though; have they actively agreed to be listed?

Also, I wonder whether you have considered adding as an author the third person who contributed to the package, especially early on, helping to set up the package skeleton, CI, etc. Unless they have fully opted out, I would appreciate it if they were at least mentioned in the acknowledgements.

@whedon check repository

Software report (experimental):

github.com/AlDanial/cloc v 1.84  T=0.19 s (158.9 files/s, 24758.7 lines/s)
-------------------------------------------------------------------------------
Language                     files          blank        comment           code
-------------------------------------------------------------------------------
Python                          18            584            376           2450
TeX                              1             25              0            278
Markdown                         2            114              0            191
reStructuredText                 3            101             21            170
Jupyter Notebook                 1              0            195             65
YAML                             2              3              2             40
DOS Batch                        1              8              1             26
make                             1              4              7              9
TOML                             1              0              0              3
-------------------------------------------------------------------------------
SUM:                            30            839            602           3232
-------------------------------------------------------------------------------


Statistical information for the repository '2641' was gathered on 2020/09/10.
The following historical commit information, by author, was found:

Author                     Commits    Insertions      Deletions    % of changes
Edward Gomez                    97          8541           5484           76.81
Joe Singleton                    3          1799           1765           19.52
Michael Fitzgerald              30           501            169            3.67

Below are the number of rows from each author that have survived and are still
intact in the current revision:

Author                     Rows      Stability          Age       % in comments
Edward Gomez               2913           34.1         10.0                7.83
Joe Singleton                69            3.8         14.0                1.45
Michael Fitzgerald          428           85.4          1.1                9.58

That's a really good call. @joesingo would you like to be an author? You helped substantially with the tom_astrosource aspect as well as setting up RTD and Travis.

The last 2 authors provided development input before I was involved, and before the code was on GitHub.

I've added @joesingo to the author list @bsipocz

:wave: @bsipocz & @joshspeagle - just wanted to check in to see how you're getting along with your reviews here?

@arfon - I got a bit swamped and dropped the ball, will try to get back to it over the weekend.

Same situation here — I’ve been absolutely swamped but will get to it this weekend. Apologies for the delay.

This will unfortunately be delayed even further on my end. I've recently moved internationally and, due to technical issues and quarantine requirements, it's taking longer to get internet set up at our new place than we had expected. I'm hopeful that this will be resolved in the next few days.

Sorry about the extensive delays getting this referee report in.

👋 @bsipocz & @joshspeagle - any chance you could try and complete your reviews in the next week or so?

My review will be done by tomorrow. Thanks in advance to everyone for your patience.

Review is done! Comments below:

  • Documentation is functional enough to meet JOSS requirements, I think, but definitely not super thorough. The docs seem like they were written more as a reference and less as a guide for new users. There are also limited examples of the internal Python API that the CLI runs off of (a stated feature of the package; see the illustrative sketch after this list), although there's enough material online and in the codebase for someone to figure it out by poking around. Adding those in would be helpful but not required.
  • Automatic tests are helpful, although figuring out the outputs is nontrivial even with the explanations. Additional explicit guidance working through an example would help enormously here.
  • No issues with the paper other than that the figure appears to be mislabeled in the submitted PDF (it shows up as "??" for me).
  • While not a requirement for acceptance, I would encourage the author(s) to try and add more examples and documentation to make the code easier to work through. In particular, I think it's especially important to have worked-through cases illustrating for users how exactly statements like this actually work in the code: "Solutions to avoid these issues have been easily incorporated into the astrosource code, but would likely be cumbersome to explain manually in any given observer guide."
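
For what it's worth, below is a purely illustrative sketch of the kind of worked Python-API example I have in mind. The entry point, method names, and target format are my assumptions about the interface the CLI is built on, not necessarily astrosource's actual API, so the authors would need to adapt it to the real names.

```python
# Illustrative sketch only: TimeSeries and its methods are assumed names for the
# Python API underlying the CLI, not a confirmed interface. The target below is
# an arbitrary (RA, Dec, 0, 0) tuple in degrees used purely for demonstration.
from pathlib import Path
from numpy import array

from astrosource import TimeSeries  # assumed import path

# Directory containing the provided per-image photometry files.
data_dir = Path("photometry/")

# Target position as (RA, Dec, 0, 0), mirroring the CLI's --ra/--dec inputs.
targets = array([(154.9083, -9.8062, 0.0, 0.0)])

ts = TimeSeries(indir=data_dir, targets=targets)
ts.analyse()     # select stable comparison stars and build the differential light curve
ts.photometry()  # write out calibrated photometry for the target
ts.plot()        # produce light-curve (and, if requested, period-search) plots
```

Something along these lines in the docs, run against a small bundled dataset with the expected outputs shown, would go a long way for new users.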

As most of the above issues are recommendations and not requirements, I'm happy to recommend the submission be accepted.

Thanks @joshspeagle! Those are really constructive comments. The docs are an ongoing effort, but giving worked-through examples is a good idea.

I've fixed up the figure references (and the missing captions).
