Joss-reviews: [REVIEW]: pyveg: A Python package for analysing the time evolution of patterned vegetation using Google Earth Engine

Created on 17 Jul 2020 · 63 Comments · Source: openjournals/joss-reviews

Submitting author: @samvanstroud (Samuel Van Stroud)
Repository: https://github.com/alan-turing-institute/monitoring-ecosystem-resilience
Version: 1.1.0
Editor: @usethedata
Reviewers: @arbennett, @usethedata
Archive: 10.5281/zenodo.4281273

:warning: JOSS reduced service mode :warning:

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status

[status badge]

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/929c21011e8fbcaeca6b1255fdf04d5d"><img src="https://joss.theoj.org/papers/929c21011e8fbcaeca6b1255fdf04d5d/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/929c21011e8fbcaeca6b1255fdf04d5d/status.svg)](https://joss.theoj.org/papers/929c21011e8fbcaeca6b1255fdf04d5d)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@arbennett, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist, please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If you have any questions or concerns, please let @usethedata know.

✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨

Review checklist for @arbennett

Conflict of interest

  • [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Contribution and authorship: Has the submitting author (@samvanstroud) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @usethedata

Conflict of interest

  • [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Contribution and authorship: Has the submitting author (@samvanstroud) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

All 63 comments

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @arbennett it looks like you're currently assigned to review this paper :tada:.

:warning: JOSS reduced service mode :warning:

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

:star: Important :star:

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' https://github.com/openjournals/joss-reviews:

[screenshot: repository watch setting]

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

[screenshot: notification settings]

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

Reference check summary:

OK DOIs

- 10.1098/rsos.160443 is OK
- 10.1111/gcb.14059 is OK
- 10.1073/pnas.0802430105 is OK
- 10.1103/PhysRevE.71.056103 is OK

MISSING DOIs

- None

INVALID DOIs

- None

👋 @arbennett -- any update on your review?

@usethedata thanks for the reminder! I am working on the review now!

:wave: Hi @samvanstroud - thanks for developing pyveg; I think it sounds like a very nice tool for processing GEE vegetation data. However, in reviewing this I came across a few difficulties that I think need to be addressed before accepting it for publication. I have opened a few issues on the monitoring-ecosystem-resilience repository for organization. Here is a list:

👋 Hi @arbennett, thank you for your comments and issues; they are very relevant, and some were already on our to-do list! @nbarlowATI and I will be working through these issues in the next couple of weeks and will keep you updated.

@crangelsmith -- Thanks for the comment. I'm thinking I'll do the second review, as I'd like to dig into this work more carefully than I have so far; it relates to some of what I do in my day job.

@whedon add @usethedata as reviewer

OK, @usethedata is now a reviewer

hi 👋 – @arbennett, @usethedata: it looks like this review needs attention? The last activity was over six weeks ago. Can you give us an update on your timeline for this? Thanks!

hi @labarba - I submitted my review on July 27 with some outstanding issues around being able to run some of the code. It looks like there has been recent development on the repo, but I haven't been updated on whether we should do another round of review.

hi @labarba @arbennett, sorry for the silence. We have been working hard on the project, implementing the review comments but also doing general development in view of a science paper we are also aiming to publish.

We are planning to do a PR to master this week, which will include the implementation of the comments from @arbennett. The code has also evolved a bit, and we would like to add a couple more authors to the paper, to acknowledge the work some new members have done in the last couple of months. Is it OK to update the paper with this new info (authors plus a couple of sentences about the new functionality)?

@crangelsmith Sorry to have been absent, but end-of-fiscal-year stuff was pretty hectic. I would say you should go ahead and revise the paper. Let me take a look at the changes in the code from what @arbennett has already reviewed and decide how best to proceed with the review.

@crangelsmith Just checking -- what is the timeline for your updates? I see that you've merged the changes related to @arbennett's comments, but it looks like the paper hasn't been updated. No worries, just asking about the timeline.

Hi @usethedata, sorry for the slow updates. We are currently in a transition period for this project: the RSE-funded time has finished (that is, me, @nbarlowATI and @samvanstroud) and we have handed over the finalisation of the project to the researchers in Exeter (@jbuxt, @caboulton and @jabrams23). We will all keep working on getting this paper published, but in different capacities.

I think the new version of the paper will be merged into master imminently, but I will let @jbuxt confirm.

Hello @usethedata, apologies for the delay in merging the paper into master. It should all be up to date now; please do let me know if there are any comments/questions.

@arbennett -- can you go through and complete your review?

@jbuxt @crangelsmith -- I expect to work through my review over the weekend. Normally I don't review papers I edit, but this one is of particular interest to my day job.

@usethedata - yep I'll complete my review over the weekend as well!

@whedon generate pdf

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

I've re-looked at things and several of my initial issues have been solved, but it looks like work hasn't quite been completed on all of them. The updated documentation looks really great and is very in-depth - thank you for that! It seems the tutorials and examples still haven't quite been updated to run through smoothly. From what I can see this is still a known issue, as there are open issues on these topics, so I'm leaving them unchecked in my initial review.

On the paper itself, I think it explains the motivation and overall structure of pyveg quite well, but as someone who does not work on ecosystem resilience I'm not quite ready to check the 'State of the Field' box. I think it would be really helpful to detail any previous tools or initiatives for monitoring ecosystem resiliency. If there really haven't been any, I think that's also worth stating.

I also have one minor point on the referencing that I think would be nice to update: the part on the Crystal Ball distribution. Instead of linking to the Wikipedia page here, is there a reference where this distribution has been used for this purpose? That would tie in better and give more of a springboard for interested parties.
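
(For other readers of this thread: if I remember the definition right, the Crystal Ball function is a Gaussian core stitched onto a power-law low-side tail, with N a normalisation constant - consider this a sketch from memory rather than an authoritative statement:

    f(x; \alpha, n, \bar{x}, \sigma) = N \cdot
    \begin{cases}
      \exp\left( -\frac{(x - \bar{x})^2}{2\sigma^2} \right)
        & \text{for } \frac{x - \bar{x}}{\sigma} > -\alpha \\[1ex]
      A \cdot \left( B - \frac{x - \bar{x}}{\sigma} \right)^{-n}
        & \text{for } \frac{x - \bar{x}}{\sigma} \le -\alpha
    \end{cases}

    \text{where } A = \left( \frac{n}{|\alpha|} \right)^{n}
    \exp\left( -\frac{|\alpha|^2}{2} \right)
    \quad \text{and} \quad
    B = \frac{n}{|\alpha|} - |\alpha|.

A citation showing this used for fitting time series would make the modelling choice much easier to follow.)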

@arbennett I do work in and around ecosystem resiliency and work with others who do. I'll take a particular look at that part.

Minor issue with the paper bibliography's casing. Fix proposed in https://github.com/alan-turing-institute/monitoring-ecosystem-resilience/pull/481. And thanks to @danielskatz for teaching me how to fix this. I'm comfortable checking off the references element when the authors review my PR and accept it, or make the functionally equivalent changes.

An observation for the team @crangelsmith @samvanstroud is that the Azure messages feel annoying and are inconsistent with the structure of the other output. For example, getting this output from the test run:

    azure_config.py not found - this is needed for using Azure storage or batch.
    Copy pyveg/azure_config_template.py to pyveg/azure_config.py then input your
    own values for Azure Storage account name and Access key, then redo `pip install .`


    azure_config.py not found - this is needed for using Azure storage or batch.
    Copy pyveg/azure_config_template.py to pyveg/azure_config.py then input your
    own values for Azure Storage account name and Access key, then redo `pip install .`

2020-10-25 14:36:31,897 [INFO] Sentinel2: setting collection_name to COPERNICUS/S2
2020-10-25 14:36:31,897 [INFO] Sentinel2: setting data_type to vegetation

getting the azure_config.py message twice, and not as a (for example) [INFO] message, feels off. For that matter, why not have the default install include an azure_config.py file that says "I don't have Azure access and don't bug me about it" and then provide the instructions for people to change that? Or, at least, why pester the user about that file being missing? It's a bit of a small thing, but in an open-source context, nagging users about not using a commercial, paid-for service seems a bit of a blemish on some really great work. Supporting Azure is great, and no worries about that. It's just the default behavior that feels a bit off.
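
To make that concrete, here's a minimal sketch of the behaviour I have in mind, assuming a guarded import; the logger setup here is illustrative, not pyveg's actual internals:

    import logging

    # Send the message through the same logging setup as the rest of the
    # output, and emit it once rather than printing on every import.
    logging.basicConfig(format="%(asctime)s [%(levelname)s] %(message)s",
                        level=logging.INFO)
    logger = logging.getLogger("pyveg")

    try:
        from pyveg import azure_config  # user-supplied; absent by default
    except ImportError:
        azure_config = None

    if azure_config is None:
        logger.info(
            "azure_config.py not found; Azure storage/batch is disabled. "
            "Copy pyveg/azure_config_template.py to pyveg/azure_config.py "
            "to enable it."
        )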

Regarding the state of the field question @arbennett raises: I've read the section of the paper carefully and consulted with a couple of colleagues on the question. The short answer is that I'm comfortable with this as presented.

The longer answer is that ecosystem resilience is an application area as far as this paper is concerned. The paper does not have an exhaustive list of references on the topic, but that's also not necessary in my view (as a still relatively new JOSS topic editor). They point to a couple of papers which my colleagues felt were reasonably representative of the subject, and provide a sufficient summary of the area for a reader interested in that particular application. This is a paper about the software, and it would support, for example, a paper in the Journal of Remote Sensing or another venue to discuss more of the science.

I also considered the question from the state of the field perspective for the algorithms presented, and I'm also comfortable that this was done at a level consistent with other JOSS papers that I went out to look at. So, I'm comfortable checking that box from my perspective. @arbennett -- does that address your question?

@arbennett and @usethedata Thank you both for reviewing this over the weekend and for providing constructive and helpful comments.

I believe that @crangelsmith has approved the pull request to update the bibliography, and I will improve the Crystal Ball reference. We will discuss as a group how best to address the rest of the comments in order to make some improvements.

@usethedata - thanks for checking that. I'm happy to check the box then, given your perspective. I'm also pretty new to reviewing for JOSS, so I'm not always certain about the exact criteria. This is helpful though, thanks!

@whedon generate pdf

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

@arbennett -- what's left from your perspective on the review? Thanks.

@jbuxt -- as a clarification to my earlier comments (about the Azure message). If there's a simple fix for that issue, then great, and I'd love to see it addressed before publication. However, I do not consider it a blocker from my perspective as a reviewer. It is a suggestion for the team to consider.

@usethedata I think it is a good point that the Azure message is a bit "naggy" when, as you say, it is entirely optional whether the user wants to use Azure. We will silence this (probably relegating it to an ERROR raised only if a function that actually uses Azure is called) in the next update, within the next couple of days.
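
Roughly along the lines of this sketch - the function and attribute names here are placeholders rather than our final API:

    import logging

    logger = logging.getLogger("pyveg")

    def _load_azure_config():
        """Import the user's Azure settings only at the point of use."""
        try:
            from pyveg import azure_config
        except ImportError as err:
            # Complain here, when Azure is actually requested,
            # instead of nagging at import time.
            raise RuntimeError(
                "Azure requested but pyveg/azure_config.py is missing. "
                "Copy pyveg/azure_config_template.py to pyveg/azure_config.py, "
                "fill in your credentials, then redo `pip install .`"
            ) from err
        return azure_config

    def upload_results(path):
        """Hypothetical Azure-dependent entry point."""
        config = _load_azure_config()
        logger.info("Uploading %s to storage account %s",
                    path, config.storage_account_name)
        # ... actual Azure storage calls would go here ...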

I think the paper itself is in good condition, but I still wasn't able to successfully run through the tutorials via Binder. It seems the issue I submitted is still open, so I think that's still being worked out.

@crangelsmith @nbarlowATI I believe @arbennett is referring to https://github.com/alan-turing-institute/monitoring-ecosystem-resilience/issues/367, which is still showing as open, but I see a note in that issue that it was moved to Done on September 18th. But given that @arbennett commented on seeing this issue recently, it doesn't feel closed. Can you help me understand what's going on? Thanks much.

Hi @usethedata, I believe that the particular notebook/cell that @arbennett referred to in alan-turing-institute/monitoring-ecosystem-resilience#367 was fixed back then (that notebook has been completely rewritten, and the problem of download sizes being exceeded is fixed). I just tested the latest version on Binder and it does now run through (although the cell that runs the pipeline is extremely slow - I guess Binder, understandably, doesn't provide super-fast VMs!).

There were some problems in the other tutorial notebooks though, which I believe may have been what @arbennett was referring to in the recent comment. These should hopefully also be fixed now in the latest version pushed to master today.

:+1: Thanks @nbarlowATI - I'll try this out again later this week!

I tried out the updated tutorial, and things ran through, though it took a bit of time. Still, that was my only remaining stumbling block, so I would be happy to accept this submission!

@arbennett excellent. If you'd be so kind as to mark your items as complete in the review checklist, that would be great.

@whedon generate pdf

@whedon check references

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

@whedon check references

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1098/rsos.160443 is OK
- 10.1111/gcb.14059 is OK
- 10.1073/pnas.0802430105 is OK
- 10.1103/PhysRevE.71.056103 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@samvanstroud (and team):
1) Looking at the repository, I'm seeing version 1.0 as the current release, with a date of March 2020. I don't know if you want to tag at 1.0.1 or something similar to note the changes made since this review started. Please confirm what you want for the version number for the paper and either let me know you're going with 1.0 or set up the new release/tag and let me know that.
2) When you have that, please create a deposit at Zenodo and post back here with the DOI that they give you.

Regards,
Bruce

I confirm I have no further editorial changes to make to the paper itself.

Hi @usethedata,

  1. Looking at the repository, I'm seeing version 1.0 as the current release, with a date of March 2020. I don't know if you want to tag at 1.0.1 or something similar to note the changes made since this review started. Please confirm what you want for the version number for the paper and either let me know you're going with 1.0 or set up the new release/tag and let me know that.

Yes, we decided to move to version 1.1.0; this is now tagged and updated on the repo.

  2. When you have that, please create a deposit at Zenodo and post back here with the DOI that they give you.

Done, this is the DOI they provided: 10.5281/zenodo.4281273

Thank you and @arbennett for all your work reviewing this paper!!!

@whedon set 1.1.0 as version

OK. 1.1.0 is the version.

@whedon set 10.5281/zenodo.4281273 as archive

OK. 10.5281/zenodo.4281273 is the archive.

@whedon generate pdf

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

@whedon accept

Attempting dry run of processing paper acceptance...

:wave: @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof :point_right: https://github.com/openjournals/joss-papers/pull/1931

If the paper PDF and Crossref deposit XML look good in https://github.com/openjournals/joss-papers/pull/1931, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.
@whedon accept deposit=true

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.1098/rsos.160443 is OK
- 10.1111/gcb.14059 is OK
- 10.1073/pnas.0802430105 is OK
- 10.1103/PhysRevE.71.056103 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon accept deposit=true

Doing it live! Attempting automated processing of paper acceptance...

๐Ÿฆ๐Ÿฆ๐Ÿฆ ๐Ÿ‘‰ Tweet for this paper ๐Ÿ‘ˆ ๐Ÿฆ๐Ÿฆ๐Ÿฆ

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited :point_right: https://github.com/openjournals/joss-papers/pull/1932
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.02483
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! 🎉🌈🦄💃👻🤘

    Any issues? Notify your editorial technical team...

Congratulations to @samvanstroud (Samuel Van Stroud) and co-authors!!

And thanks to @arbennett for reviewing, and @usethedata for reviewing/editing!

:tada::tada::tada: Congratulations on your paper acceptance! :tada::tada::tada:

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.02483/status.svg)](https://doi.org/10.21105/joss.02483)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.02483">
  <img src="https://joss.theoj.org/papers/10.21105/joss.02483/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.02483/status.svg
   :target: https://doi.org/10.21105/joss.02483

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
