Joss-reviews: [REVIEW]: StatAid: a R package with a graphical user interface for data analysis

Created on 2 Sep 2020 · 79 Comments · Source: openjournals/joss-reviews

Submitting author: @VincentAlcazer (Vincent Alcazer)
Repository: https://github.com/VincentAlcazer/StatAid
Version: v1.1.2
Editor: @mikldk
Reviewers: @nistara, @adithirgis
Archive: 10.5281/zenodo.4152933

:warning: JOSS reduced service mode :warning:

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

Status

[status badge]

Status badge code:

HTML: <a href="https://joss.theoj.org/papers/ae17f4e76e8559c635d41dfe9405da73"><img src="https://joss.theoj.org/papers/ae17f4e76e8559c635d41dfe9405da73/status.svg"></a>
Markdown: [![status](https://joss.theoj.org/papers/ae17f4e76e8559c635d41dfe9405da73/status.svg)](https://joss.theoj.org/papers/ae17f4e76e8559c635d41dfe9405da73)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) by leaving comments in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@nistara & @adithirgis, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. Any questions/concerns please let @mikldk know.

✨ Please start on your review when you are able, and be sure to complete your review in the next six weeks, at the very latest ✨

Review checklist for @nistara

Conflict of interest

  • [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Contribution and authorship: Has the submitting author (@VincentAlcazer) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?

Review checklist for @adithirgis

Conflict of interest

  • [x] I confirm that I have read the JOSS conflict of interest (COI) policy and that: I have no COIs with reviewing this work or that any perceived COIs have been waived by JOSS for the purpose of this review.

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Contribution and authorship: Has the submitting author (@VincentAlcazer) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?
  • [x] Substantial scholarly effort: Does this submission meet the scope eligibility described in the JOSS guidelines?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the functionality of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Summary: Has a clear description of the high-level functionality and purpose of the software for a diverse, non-specialist audience been provided?
  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] State of the field: Do the authors describe how this software compares to other commonly-used packages?
  • [x] Quality of writing: Is the paper well written (i.e., it does not require editing for structure, language, or writing quality)?
  • [x] References: Is the list of references complete, and is everything cited appropriately that should be cited (e.g., papers, datasets, software)? Do references in the text use the proper citation syntax?
Labels: HTML, R, TeX, accepted, published, recommend-accept, review

Most helpful comment

@nistara, @adithirgis - many thanks for your reviews here and to @mikldk for editing this submission ✨

@VincentAlcazer - your paper is now accepted into JOSS :zap::rocket::boom:

All 79 comments

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @nistara, @adithirgis it looks like you're currently assigned to review this paper :tada:.

:warning: JOSS reduced service mode :warning:

Due to the challenges of the COVID-19 pandemic, JOSS is currently operating in a "reduced service mode". You can read more about what that means in our blog post.

:star: Important :star:

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' https://github.com/openjournals/joss-reviews:

[watching screenshot]

  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

[notifications screenshot]

For a list of things I can do to help you, just type:

@whedon commands

For example, to regenerate the paper pdf after making changes in the paper's md or bib files, type:

@whedon generate pdf

@nistara, @adithirgis: Thanks for agreeing to review. Please carry out your review in this issue by updating the checklist above and giving feedback in this issue. The reviewer guidelines are available here: https://joss.readthedocs.io/en/latest/reviewer_guidelines.html. If possible create issues (and cross-reference) in the submission's repository to avoid too specific discussions in this review thread.

If you have any questions or concerns please let me know.

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.18637/jss.v014.i09 is OK

MISSING DOIs

- None

INVALID DOIs

- None

@whedon commands

Here are some things you can ask me to do:

# List Whedon's capabilities
@whedon commands

# List of editor GitHub usernames
@whedon list editors

# List of reviewers together with programming language preferences and domain expertise
@whedon list reviewers

EDITORIAL TASKS

# Compile the paper
@whedon generate pdf

# Compile the paper from alternative branch
@whedon generate pdf from branch custom-branch-name

# Ask Whedon to check the references for missing DOIs
@whedon check references

# Ask Whedon to check repository statistics for the submitted software
@whedon check repository

Started review here.

@whedon check references

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.18637/jss.v014.i09 is OK

MISSING DOIs

- None

INVALID DOIs

- None

Review process started here.

@VincentAlcazer, can you give a brief status on the review issues raised in the separate threads by @nistara and @adithirgis?

Dear @mikldk,
I am currently working on the issues raised by the reviewers, and their comments are really helping me improve the software.
Due to the evolution of the pandemic in France, I am also quite busy at the hospital, with only a few moments to work on StatAid. However, I have already corrected the main issues reported and I hope to be able to provide my complete answer soon.
I am very sorry for this delay and will do my best to proceed through the review in time.
Best regards,

@VincentAlcazer No need to be sorry - take your time. I just wanted to check in, also for the reviewers' sake. Be safe.

Hi @VincentAlcazer, please take your time, we're in no hurry. Thanks for your important work right now during the pandemic.

Hi @VincentAlcazer, I totally agree with @nistara and @mikldk, please take your time. And please stay safe.
Take care.

@whedon generate pdf

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

Dear @mikldk,

I have completed my review process. Thanks again for inviting me to review.

Thanks @VincentAlcazer for the wonderful application. This is a really interesting and useful software!

Regards and Stay Safe
Adithi

@whedon generate pdf

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

Dear @mikldk and @VincentAlcazer ,

I'm done with my review. Thank you for inviting me as a reviewer. This package is pretty awesome and should help many people explore various descriptive and statistical tools and visualizations with one tool! I also appreciate the careful consideration given to what potential users might need to customize their analyses and plots.

Best wishes,
Nistara

Dear @mikldk ,

Thank you for giving me the opportunity to have my software reviewed by JOSS. I would like to thank @nistara and @adithirgis, who did an amazing job thoroughly reviewing StatAid and really helped me improve the software.

Please let me know if there is anything else I should do now for the rest of the submission process.

Best regards,

Vincent Alcazer

@whedon check references

Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.18637/jss.v014.i09 is OK

MISSING DOIs

- None

INVALID DOIs

- None

Thanks to @nistara and @adithirgis for their reviews.

@VincentAlcazer :

  • How does this software compare to iNZight?
  • Please have a final read-through of the paper, checking language etc.
  • Have a final check of the proofs with @whedon generate pdf
  • Please make a tagged release and archive (e.g. with Zenodo) as described here, and report the version number and archive DOI in this thread. Please verify that the archive deposit has the correct metadata (title and author list), or edit these if that is not the case.

@whedon generate pdf

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

Dear @mikldk ,

I did not know iNZight, which seems to be a pretty good piece of software developed by a serious team. However, it differs from StatAid on several points:

  • StatAid is more user-friendly, giving the possibility to quickly plot publication-ready graphs or tables. On the other hand, it offers fewer options than iNZight, which provides a more complete environment to modify data and recode variables, for example. The absence of data/variable modification in StatAid was a deliberate choice, allowing variable checks and controls to be implemented before all tests and analyses. StatAid therefore pushes users to understand their variables (the differences between numeric, categorical and time-dependent variables) and how they should be encoded so that StatAid recognizes the correct variable type.
  • StatAid really guides the user through their data analysis. To that end, StatAid makes some choices and reports the method used (for example, it automatically chooses the most appropriate statistic/test for some types of comparison). Combined with automatic variable detection, the software helps the user avoid inappropriate tests or graph choices.
  • StatAid provides graph customization options (such as colour palettes from well-known journals (JCO, Nature, NEJM…) and full graph-legend modification), which do not seem to be implemented in iNZight.
  • Compared to iNZight, StatAid offers the possibility to perform survival analysis (Kaplan-Meier curves, uni- and multivariate Cox analyses). Conversely, compared to StatAid, iNZight offers the possibility to perform modelling with multiple responses.
  • StatAid is completely open source and offers users the possibility either to request a feature or to contribute an improvement. This open-source aspect is also reassuring for people working with sensitive data (e.g. data from clinical trials / patients).
  • StatAid can be used online on shinyapps.io, which is an important feature when the user cannot install external software on their computer (as is the case in many hospitals, for example); a minimal local installation sketch is given after this comment.

Altogether, I would say that StatAid is more convenient for quickly producing publication-ready graphs and tables, really guiding users through their data analysis. iNZight seems to offer a more complete solution (with data modification and recoding possibilities), at the cost of a less user-friendly, less guiding interface. Only StatAid provides a time-dependent variable analysis option and has this evolving, contributive aspect.

I checked the paper.md and everything seems OK. I uploaded version 1.1 to Zenodo with the following DOI:

10.5281/zenodo.4081353

Thank you.

Best regards,

Vincent Alcazer
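
A minimal sketch of a local installation, assuming the package installs directly from the GitHub repository via remotes and exposes a run_app()-style launcher, as is conventional for Shiny-app packages; the repository README is the authoritative source for the exact commands.

R:
# Minimal sketch: function names are assumed from common Shiny-package conventions;
# see the StatAid README for the authoritative instructions.
install.packages("remotes")                        # helper for installing from GitHub
remotes::install_github("VincentAlcazer/StatAid")  # install StatAid and its dependencies
StatAid::run_app()                                 # launch the graphical interface locally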

@VincentAlcazer: I think the paper would benefit from including some of these considerations. What do you think, @nistara and @adithirgis?

Hi @mikldk and @VincentAlcazer,

I think it would be good to add the strong points (for the benefit of the users) to the paper.

I agree as well... these points provide a contrast for StatAid and also highlight some of its salient features. In addition to options for different statistical tests, I particularly liked the visualization options and the choice of different color palettes. A lot of consideration should go into analyses in general, and there could be models more suitable than those provided at the moment (e.g. mixed models), but I can see lots of people, including students, benefiting from exploring their data with this software and using it for their work. So yes, including the above considerations would definitely benefit the paper and the readers in understanding what the software offers.

Dear @mikldk @nistara @adithirgis ,

Thank you again for your messages and suggestions.

The paper.md has been updated to take these considerations into account:

Addition in the summary: "Other free software exist such as iNZight or Jamovi. However, while providing solutions with multiple features such as variable recoding, these software do not guide the user through the analysis and can lack some key features such as time-dependent outcome analysis."

In the Statement of need: "Compared to other free similar software, StatAid has been designed to quickly produce publication-ready graph and table by really guiding the user through their data analysis and providing multiple graph customization options. By limiting the number of choices and integrating different check and variable controls, StatAid hence help the user to prevent bad test use or bad graph choice. Besides, as an evolving software, only StatAid is providing the possibility for users to ask for the implementation of a particular feature or to contribute to the software development. Its open-source aspect can also be seen as a security for people working with sensitive data (e.g. data from clinical trials / patients). The online version of StatAid renders it accessible everywhere, even on computers with restrictive software installation policies such as hospitals or research centers."

Please let me know if you have anything else in mind to improve the paper/software.

Best regards,

Vincent

I think this is summarised well.
I had a small suggestion if everyone else agrees (and also if I am not wrong :))

In the Statement of need: "Compared to other free similar software, StatAid has been designed to quickly produce publication-ready graph and table by ~really~ guiding the user through their data analysis and providing multiple graph customization options. By limiting the number of choices and integrating different check and variable controls, StatAid ~hence~ _helps_ the user to prevent bad test use or bad graph choice. Besides, as an evolving software, ~only~ StatAid ~is~ _also provides_ the possibility for users to ~ask~ _request_ for the implementation of a particular feature or to contribute to the software development. Its open-source aspect can also be seen as a security for people working with sensitive data (e.g. data from clinical trials / patients). _The online version of StatAid renders it accessible everywhere, even on computers with restrictive software installation policies such as hospitals or research centers._ (Can this sentence be modified a little?)"

@whedon generate pdf

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

Dear @mikldk @nistara @adithirgis ,

The paper.md has been updated with @adithirgis 's suggestions.

Best regards,

Vincent

@nistara, @adithirgis: Can you confirm that you have finished the review and recommend that this paper is now published?

@VincentAlcazer:

  • In the paper you have "Compared to other free similar software, StatAid has been designed to quickly produce publication-ready graph and table by guiding the user through their data analysis and providing multiple graph customization options."

    • To me it sounds more correct to write "publication-ready graphs and tables", i.e. in plural, but I am not a native English speaker. Does anyone know the correct form?

  • Please have a final read-through of the paper, checking language etc.
  • Have a final check of the proofs with @whedon generate pdf
  • Please make a tagged release and archive (e.g. with Zenodo) as described here, and report the version number and archive DOI in this thread. Please verify that the archive deposit has the correct metadata (title and author list), or edit these if that is not the case.

Hi all!

I confirm that I have reviewed the paper and I recommend publishing it. All the best @VincentAlcazer! StatAid is an amazing and useful piece of software.

Though I am not a native English speaker myself, I do feel that the plural usage suggested in the first point is correct.

Thanks so much again for inviting me!

Regards and Stay Safe
Adithi

@whedon generate pdf

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

Hi @mikldk, @VincentAlcazer, and @adithirgis

I had previously submitted a pull request for typos and similar minor issues in the paper/software (https://github.com/VincentAlcazer/StatAid/commit/d0afe60a93836b246158fe15b35f81567d1b8fc8); however, I am not sure whether I can make extensive grammatical edits as a reviewer. I do realize/respect that as multilingual people we might require some assistance in one language or another, so if it's acceptable, I have done some extra proofreading and can create another pull request for it: https://github.com/nistara/StatAid/commit/85fe82e6da936b829988c61a747db3865766c115

If it's fine by you, I would recommend this paper once the above edits have been incorporated.

Regards,
Nistara

Dear @nistara ,

Thank you for this complete proofread of the paper. If it's fine with everybody, I will wait for your pull request, integrate it and then make the new Zenodo archive.

Best regards,

Vincent

@nistara @VincentAlcazer That sounds like a good plan. Thanks. Please ping me once the PR has been merged and the archive updated.

@whedon generate pdf

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

Dear @mikldk ,

The paper has been fully revised and the software archive updated on Zenodo (StatAid v1.1.0 - DOI 10.5281/zenodo.4146739).

I would like to thank the reviewers @nistara and @adithirgis again for their amazing work and for the time they have taken to help me improve both the software and the paper.

Best regards,

Vincent Alcazer

@whedon set v1.1.0 as version

OK. v1.1.0 is the version.

@whedon set 10.5281/zenodo.4146739 as archive

OK. 10.5281/zenodo.4146739 is the archive.

@VincentAlcazer There seems to be an issue with the archive. The file available for download is named 1.1 (not 1.1.0), and there is no 1.1.0 release in the repository (https://github.com/VincentAlcazer/StatAid/releases).

@mikldk the version has been corrected on GitHub. The archive available on Zenodo is also the correct version with the updated paper; only the name has not been changed (because the software itself was not changed).

Would you prefer that I upload a new version to Zenodo with the renamed archive (1.1.0 instead of 1.1)?

At https://joss.readthedocs.io/en/latest/submitting.html you can read that "Upon successful completion of the review, authors will make a tagged release of the software, and deposit a copy of the repository with a data-archiving service such as Zenodo [...]".

So yes, the version in the paper (1.1.0) must have a tagged release of the same version that is also available on Zenodo: all the same version. You can integrate Zenodo and GitHub so that GitHub releases automatically get a Zenodo release, as sketched below.
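
A minimal sketch of the tagging step, assuming the Zenodo/GitHub integration has already been enabled for the repository so that Zenodo archives each new GitHub release automatically:

# Create an annotated tag matching the paper version and push it to GitHub
git tag -a v1.1.0 -m "StatAid v1.1.0"
git push origin v1.1.0
# Creating a GitHub release from this tag then triggers the Zenodo archive and DOI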

@mikldk : Sorry for the inconvenience. The Zenodo archive has been linked with GitHub and published with the updated name (DOI: 10.5281/zenodo.4146817).

@whedon set 10.5281/zenodo.4146817 as archive

OK. 10.5281/zenodo.4146817 is the archive.

@VincentAlcazer It looks like you have folders such as .Rproj.user in your Git. Please remove it by creating a .gitignore file. Check other folders, too - I am not sure about e.g. rsconnect.
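
A minimal sketch of this clean-up, noting that a .gitignore entry alone does not untrack folders that are already committed; they also have to be removed from the index (the exact folder list is whatever applies to the repository):

# Add the folders to .gitignore (one pattern per line), e.g. .Rproj.user/ and rsconnect/
# Then remove the already-tracked copies from the index while keeping the local files:
git rm -r --cached .Rproj.user rsconnect
git commit -m "Stop tracking .Rproj.user and rsconnect"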

@mikldk it is strange, as the .gitignore file already included .Rproj.user. I manually deleted the folders and added an rsconnect exclusion to the .gitignore file. It seems to work correctly now.

I had to push a new version to update the GitHub and Zenodo releases:
version: 1.1.01
DOI: 10.5281/zenodo.4147144

@whedon set 10.5281/zenodo.4147144 as archive

OK. 10.5281/zenodo.4147144 is the archive.

@VincentAlcazer I apologise for being pedantic, but 1.1.01 is not a valid semantic version number, and I strongly suggest you stick to semantic versioning. See https://semver.org/ under §2 ("A normal version number MUST take the form X.Y.Z where X, Y, and Z are non-negative integers, and MUST NOT contain leading zeroes.").

I would suggest calling it version 1.1.1 instead.

@mikldk I understand, and sorry for this tedious release process. In fact, I chose 1.1.01 because of a Zenodo conflict with 1.1.1.

I updated the version (and had to change the number for Zenodo):
New version: 1.1.2
10.5281/zenodo.4152933

Thank you again for your time.

Best regards,

Vincent Alcazer

@whedon set 10.5281/zenodo.4152933 as archive

OK. 10.5281/zenodo.4152933 is the archive.

@whedon set v1.1.2 as version

OK. v1.1.2 is the version.

@whedon generate pdf

:point_right::page_facing_up: Download article proof :page_facing_up: View article proof on GitHub :page_facing_up: :point_left:

@whedon accept

Attempting dry run of processing paper acceptance...
Reference check summary (note 'MISSING' DOIs are suggestions that need verification):

OK DOIs

- 10.18637/jss.v014.i09 is OK

MISSING DOIs

- None

INVALID DOIs

- None

:wave: @openjournals/joss-eics, this paper is ready to be accepted and published.

Check final proof :point_right: https://github.com/openjournals/joss-papers/pull/1878

If the paper PDF and Crossref deposit XML look good in https://github.com/openjournals/joss-papers/pull/1878, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.
@whedon accept deposit=true

@nistara, @adithirgis Thank you very much for your effort in reviewing this paper!

@whedon accept deposit=true

Doing it live! Attempting automated processing of paper acceptance...

🐦🐦🐦 πŸ‘‰ Tweet for this paper πŸ‘ˆ 🐦🐦🐦

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited :point_right: https://github.com/openjournals/joss-papers/pull/1881
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.02630
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! πŸŽ‰πŸŒˆπŸ¦„πŸ’ƒπŸ‘»πŸ€˜

    Any issues? Notify your editorial technical team...

@nistara, @adithirgis - many thanks for your reviews here and to @mikldk for editing this submission ✨

@VincentAlcazer - your paper is now accepted into JOSS :zap::rocket::boom:

:tada::tada::tada: Congratulations on your paper acceptance! :tada::tada::tada:

If you would like to include a link to your paper from your README, use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.02630/status.svg)](https://doi.org/10.21105/joss.02630)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.02630">
  <img src="https://joss.theoj.org/papers/10.21105/joss.02630/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.02630/status.svg
   :target: https://doi.org/10.21105/joss.02630

This is how it will look in your documentation:

[DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us please consider doing either one (or both) of the following:
