Joss-reviews: [REVIEW]: Genotify: Fast, lightweight gene lookup and summarization

Created on 8 Aug 2018 · 26 comments · Source: openjournals/joss-reviews

Submitting author: @j-andrews7 (Jared Andrews)
Repository: https://github.com/j-andrews7/Genotify
Version: v1.2.0
Editor: @pjotrp
Reviewer: @serine
Archive: 10.5281/zenodo.1345663

Status


Status badge code:

HTML: <a href="http://joss.theoj.org/papers/698f9aea23175978e15c4befa2b5f1a1"><img src="http://joss.theoj.org/papers/698f9aea23175978e15c4befa2b5f1a1/status.svg"></a>
Markdown: [![status](http://joss.theoj.org/papers/698f9aea23175978e15c4befa2b5f1a1/status.svg)](http://joss.theoj.org/papers/698f9aea23175978e15c4befa2b5f1a1)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@serine, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.theoj.org/about#reviewer_guidelines. Any questions/concerns please let @pjotrp know.

✨ Please try and complete your review in the next two weeks ✨

Review checklist for @serine

Conflict of interest

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Version: Does the release version given match the GitHub release (v1.2.0)?
  • [x] Authorship: Has the submitting author (@j-andrews7) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [ ] Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Authors: Does the paper.md file include a list of authors with their affiliations?
  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?


All 26 comments

Hello human, I'm @whedon. I'm here to help you with some common editorial tasks. @serine it looks like you're currently assigned as the reviewer for this paper :tada:.

:star: Important :star:

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this repository (https://github.com/openjournals/joss-reviews). As a reviewer, you're probably currently watching this repository, which means that with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' at https://github.com/openjournals/joss-reviews.
  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications

For a list of things I can do to help you, just type:

@whedon commands
Attempting PDF compilation. Reticulating splines etc...

@j-andrews7 to expedite the review process, do you mind going through the list of checkboxes above and making sure they can all be ticked (you can't tick them yourself)? Also check the PDF output carefully. Ping us here when you are done.

@pjotrp I can't tick them, but they are all unticked currently. The PDF looks fine to me.

You mean all checkboxes pass your scrutiny?

@pjotrp Oh, sorry, I misunderstood. Yes, all of the checkboxes are fine. Given that it is a GUI-based application, I don't include automated tests, but it is quite easy to test manually.

@serine you can start review. Please read the reviewer guidelines above.

@j-andrews7 I'll try to play with genotify over the next few days. I'll have more time next week.

In terms of the documentation, do you think you could write something up? Perhaps "typical" usage, or an example of how you've used it in the past? I understand that this is an aggregate of different tools plus some extra bits and is for explanatory purposes, but you obviously have a particular use case; if you can share that "workflow", I think it would be useful, as well as strengthen the argument for the tool.

These are the actual dot points that I need to tick off.

  • Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems).
  • Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Also note that "A statement of need" comes up twice on my checklist: under docs and under software paper. I ticked off the one under software paper, but haven't yet ticked off the one under docs. I think this is a minor thing, but it all revolves around the "functional docs" idea. So if you don't mind putting it all in some sort of docs, that would be very beneficial for the end user. After all, the paper will remain "static", but docs can change with time.

  • A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?

You probably have some idea of how to do docs, but if I can make a suggestion: keeping them in the repo, i.e. under docs/ in plain text (e.g. Markdown), works best in my opinion. You can then use something like MkDocs or similar to compile them to HTML and push to gh-pages. The advantage of keeping them in the repo is "everything in one place", but it also makes it rather easy to contribute to the docs: clone the repo, write some Markdown, and PR back. Something the community might want to do as the tool grows in popularity.
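As a rough sketch of the MkDocs suggestion above (the page names and theme here are hypothetical, not part of the Genotify repo), a minimal `mkdocs.yml` at the repo root might look like:

```yaml
# Minimal MkDocs configuration (hypothetical page layout);
# pages live as Markdown files under docs/
site_name: Genotify
nav:
  - Home: index.md
  - Installation: install.md
  - Usage: usage.md
theme: readthedocs
```

With that in place, `mkdocs serve` previews the site locally and `mkdocs gh-deploy` builds it and pushes the result to the gh-pages branch.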

Cheers

I can definitely write some additional docs, but I am unsure how necessary it would be to create a site for them. Would adding a functional example or two to the README work?

I've added an example to the README, let me know what you think or if you think a site is needed.

I think it's up to you @j-andrews7 where you want to keep it. There is also the wiki feature on GitHub, but I've never used it.

Looks great. I'll have a more in-depth read a bit later.

thanks

@j-andrews7 @pjotrp and @arfon All is good, and my final comment about the tool is below. I do have a couple of questions regarding finalising the review process; I'll post them as a separate comment next.

Overall I think this is a great tool. It'll most likely become a go-to app for me when I need to check a gene and associated info. The tool nicely aggregates many different APIs, making it very easy to get a thorough overview of a given gene, including function, expression, and disease associations. The tool works as described, with very quick and simple installation. I've tested it out on my Linux (Debian distro) and macOS machines; both worked out of the box with less than a minute of install time required.

Going forward, I would like to see some small improvements to the docs, mainly to attract newcomers and make it super easy to understand what each section is about. I found it a little distracting having to jump between _paper.md_ and _README.md_/docs. I would prefer docs with a table of contents, all in one place.
I think adding "reference links" from each section to the corresponding website's docs pages, for more in-depth information, would be rather useful. While one can use Google for that, it feels like this would really complement the tool's ease of use.

I'd also like to suggest some small improvements to the Expression and Diseases sections going forward. The second panel in the Expression section (the diagram view) seems to update with every new query, but the table appears to be static, with the same content irrespective of the query. I've noticed that if I click on a few different experiments, some show expression and logFC, but others show _no data is available_. Could the datasets without expression data simply be filtered out of that table? The Diseases section doesn't redirect to CTDbase; I don't know if it should, but it seems like that would be useful. Unless I've missed something.

Lastly, I would also suggest considering a web app on top of the existing Electron app, mainly to attract more users and give them options. I feel one would be more likely to test a tool with zero commitment, i.e. just hit "this" web page. Regarding website hosting, I'm fairly confident you can do that through gh-pages for free, with everything kept in the single repo.

All the best, and thanks for the tool!

@pjotrp and @arfon

I'm a little unsure about the

Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?

section. While all code ought to be tested, I'm a little uncertain which tests to suggest for this particular tool. As I've mentioned, everything appears to work fine.

Otherwise I'm happy to mark this as done.

Thanks

Thanks for the comments @serine. I will continue to update the docs as I make improvements to the tool. As it stands, I'd assume most people might take a quick read over the paper, then mostly just go through the README.

As you've noted, some expression experiments don't have data that you might expect - typically these are differential expression experiments that will only display information if the gene is differentially expressed between whatever the two conditions are. Most of the RNA-Seq Baseline-type experiments are usually more reliable. Unfortunately, there's not much I can do about it - that's just how the data from Expression Atlas is, even through their website. I very much agree that it is worth mentioning in the docs, as is adding a ToC. The experiments table will only change if the species changes - the experiments available for Human remain static, but if you click on a Mouse hit, it should change to display the mouse experiments. If it is not, then you may be hitting a bug, which I'd suggest reporting as an issue in the Genotify repo.

I have updated the README to explicitly state those nuances, since I realize they aren't obvious, as well as to add a ToC to better organize it.

I will change the Disease section to add a column that directs back to CTDbase as well.

As for the webpage, it's an idea I'll consider, but it will take some effort to de-couple the code from some of the Electron intricacies. gh-pages also only lets you publish static HTML files, so you have no control over the backend, i.e. I couldn't install node.js or any of the other js libraries I'd need for it to work. I would have to host elsewhere, which isn't a big deal, but not quite as convenient. And as you said, I'm hoping the quick install will encourage people to at least give it a try.

Thanks for reviewing, and I'm glad you've found it a useful tool. I will implement the change above (and I've also found a few edge-case bugs since submitting), and prepare to make a release that will be used for the archive.

@pjotrp is there anything I need to do to update the release version for the submission? It's listed as 1.2.0 here, but my archive will be version 1.2.1.

@j-andrews7 that's great, thanks heaps for addressing comments.

I think I missed that Genotify has a backend; apologies. In that case, yes, gh-pages is no good. And like I said, this is completely up to you, just an idea to think about. The Electron app is very good.

Cheers

@whedon generate pdf

Attempting PDF compilation. Reticulating splines etc...

The review process is now complete. To finalize your submission and accept your paper in JOSS, we need two things. First, can you confirm that all references in your bibliography have a DOI (if one exists)?

Second, we need you to deposit a copy of your software repository (including any revisions made during the JOSS review process) with a data-archiving service. To do so:

  1. Create a GitHub release of the current version of your software repository
  2. Deposit that release with Zenodo, figshare, or a similar DOI issuer.
  3. Post a comment here with the DOI for the release.

DOIs are there. @arfon this submission is also ready for acceptance when we have a DOI for the software.

@serine thanks for the fast turnaround!!

Great, thanks again to @serine and @pjotrp for reviewing/editing this. @arfon the DOI is: 10.5281/zenodo.1345663

@whedon set 10.5281/zenodo.1345663 as archive

OK. 10.5281/zenodo.1345663 is the archive.

@serine - many thanks for your review here and to @pjotrp for editing this submission ✨

@j-andrews7 - your paper is now accepted into JOSS and your DOI is https://doi.org/10.21105/joss.00885 :zap: :rocket: :boom:

:tada::tada::tada: Congratulations on your paper acceptance! :tada::tada::tada:

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](http://joss.theoj.org/papers/10.21105/joss.00885/status.svg)](https://doi.org/10.21105/joss.00885)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.00885">
  <img src="http://joss.theoj.org/papers/10.21105/joss.00885/status.svg" alt="DOI badge" >
</a>

This is how it will look in your documentation: [DOI badge]

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us, please consider doing either one (or both) of the following:
