Joss-reviews: [REVIEW]: PyBIDS: Python tools for BIDS datasets

Created on 3 Mar 2019 · 87 comments · Source: openjournals/joss-reviews

Submitting author: @tyarkoni (Tal Yarkoni)
Repository: https://github.com/bids-standard/pybids
Version: 0.9.2
Editor: @cMadan
Reviewer: @grlee77
Archive: 10.5281/zenodo.3333492

Status


Status badge code:

HTML: <a href="http://joss.theoj.org/papers/2bc5802f0656395e0312f4243e4a61ce"><img src="http://joss.theoj.org/papers/2bc5802f0656395e0312f4243e4a61ce/status.svg"></a>
Markdown: [![status](http://joss.theoj.org/papers/2bc5802f0656395e0312f4243e4a61ce/status.svg)](http://joss.theoj.org/papers/2bc5802f0656395e0312f4243e4a61ce)

Reviewers and authors:

Please avoid lengthy details of difficulties in the review thread. Instead, please create a new issue in the target repository and link to those issues (especially acceptance-blockers) in the review thread below. (For completists: if the target issue tracker is also on GitHub, linking the review thread in the issue or vice versa will create corresponding breadcrumb trails in the link target.)

Reviewer instructions & questions

@grlee77, please carry out your review in this issue by updating the checklist below. If you cannot edit the checklist please:

  1. Make sure you're logged in to your GitHub account
  2. Be sure to accept the invite at this URL: https://github.com/openjournals/joss-reviews/invitations

The reviewer guidelines are available here: https://joss.theoj.org/about#reviewer_guidelines. Any questions/concerns please let @cMadan know.

✨ Please try and complete your review in the next two weeks ✨

Review checklist for @grlee77

Conflict of interest

Code of Conduct

General checks

  • [x] Repository: Is the source code for this software available at the repository url?
  • [x] License: Does the repository contain a plain-text LICENSE file with the contents of an OSI approved software license?
  • [x] Version: 0.9.2
  • [x] Authorship: Has the submitting author (@tyarkoni) made major contributions to the software? Does the full list of paper authors seem appropriate and complete?

Functionality

  • [x] Installation: Does installation proceed as outlined in the documentation?
  • [x] Functionality: Have the functional claims of the software been confirmed?
  • [x] Performance: If there are any performance claims of the software, have they been confirmed? (If there are no claims, please check off this item.)

Documentation

  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] Installation instructions: Is there a clearly-stated list of dependencies? Ideally these should be handled with an automated package management solution.
  • [x] Example usage: Do the authors include examples of how to use the software (ideally to solve real-world analysis problems)?
  • [x] Functionality documentation: Is the core functionality of the software documented to a satisfactory level (e.g., API method documentation)?
  • [x] Automated tests: Are there automated tests or manual steps described so that the function of the software can be verified?
  • [x] Community guidelines: Are there clear guidelines for third parties wishing to 1) Contribute to the software 2) Report issues or problems with the software 3) Seek support

Software paper

  • [x] Authors: Does the paper.md file include a list of authors with their affiliations?
  • [x] A statement of need: Do the authors clearly state what problems the software is designed to solve and who the target audience is?
  • [x] References: Do all archival references that should have a DOI list one (e.g., papers, datasets, software)?

All 87 comments

Hello human, I'm @whedon, a robot that can help you with some common editorial tasks. @grlee77 it looks like you're currently assigned as the reviewer for this paper :tada:.

:star: Important :star:

If you haven't already, you should seriously consider unsubscribing from GitHub notifications for this (https://github.com/openjournals/joss-reviews) repository. As a reviewer, you're probably currently watching this repository, which means with GitHub's default behaviour you will receive notifications (emails) for all reviews 😿

To fix this do the following two things:

  1. Set yourself as 'Not watching' https://github.com/openjournals/joss-reviews:


  2. You may also like to change your default settings for watching repositories in your GitHub profile here: https://github.com/settings/notifications


For a list of things I can do to help you, just type:

@whedon commands
Attempting PDF compilation. Reticulating splines etc...

@tyarkoni, I see that you have listed your authors as "[LastName], [FirstName]". I'm not sure if that's actually against our rules, but normally authors are just listed as "[FirstName] [LastName]". Is there a particular reason you decided to list them this way?

I adapted it from the repo's .zenodo.json file. Convention for that seems to be [LastName], [FirstName].

@arfon, do you have any thoughts on the author list formatting here? I don't recall any prior JOSS papers using "[LastName], [FirstName]" formatting, but you'd know better than me--and if we want to require a specific formatting (e.g., for consistency).

@grlee77, let me know if you have any questions in starting the review!

@arfon, do you have any thoughts on the author list formatting here? I don't recall any prior JOSS papers using "[LastName], [FirstName]" formatting, but you'd know better than me--and if we want to require a specific formatting (e.g., for consistency).

Yes, please have author names as a single string (firstname lastname) with any middle initials, e.g. Arfon M Smith

I opened a couple of minor issues:
bids-standard/pybids#415
bids-standard/pybids#416

and commented on (bids-standard/pybids#245) after an initial read-through of the docs and examples. I plan to finish the checklist and review by tomorrow.

It looks like the approach here was to include all contributors to the repository. I just had a question about whether all coauthors have already actively confirmed their desire to be included?

@cMadan: If past contributors have not been notified and explicitly agreed to be included in the publication, then it seems they would need to be removed from the list? The JOSS author guidelines state

"The authors themselves assume responsibility for deciding who should be credited with co-authorship, and co-authors must always agree to be listed. In addition, co-authors agree to be accountable for all aspects of the work."

edit: on second glance it seems that the number of authors is a little less than the total number of contributors, so perhaps the list has already been pruned? I am just curious on what the decision mechanism was for inclusion or if there are any unintentional omissions.

The submitted paper is well written and nicely summarizes the capabilities of the software. The proposed software is likely to be useful to any researchers working with data in the BIDS format and complements an increasingly diverse ecosystem of python-based neuroimaging analysis software.

The software functions as advertised. I was able to run the included example notebook without issues and tried out some other variations on the examples given in the documentation.

Aside from the minor documentation issues already raised above, the following minor issues in relation to the review criteria above were found:

1.) In regards to the "Community guidelines" criterion, I did not see any mention about how to contribute or raise issues/get help in the documentation. This is likely self-evident to those already working with GitHub/open source, but it would be useful to include something brief to guide new contributors.

2.) Some of the references are missing a DOI in paper.bib and the rendered PDF.

3.) The software is tested using automated CI across platforms and python versions, but there are currently a handful of test failures on the master branch (Appveyor is listed as failing on the main project page)

The following are not specific to review checklist above, but more about two things that were not entirely clear to me as a new user when going through the examples in the documentation.

1.) There was an example using the path_patterns argument to the build_path method of a BIDSLayout. However, the API docs do not seem to explicitly describe the expected syntax for this argument.

2.) In the documentation section on generating reports, it is mentioned that the reports are returned as a collections.Counter, but it isn't mentioned why this is a useful format, nor is much detail given on how it should typically be used. The example shown just uses

main_report = counter.most_common()[0][0]

but does not comment on why the indexing [0][0] is needed, or when one would typically want or need to use the other elements returned by most_common.

@grlee77 all authors did not actively confirm their desire to be included; instead I gave people the option of opting out (and none did). Does that work? I can also repeat the request, this time requesting opt-in, but I'm concerned that there may be some people who contributed to the repo a while back and aren't paying close attention (I know from experience that there can be a mental hurdle in following up on such requests, even if it only takes a minute or two). Similarly, for some of the authors, I have no contact information aside from GitHub handle, and a few folks don't seem to be active.

1.) In regards to the "Community guidelines" criterion, I did not see any mention about how to contribute or raise issues/get help in the documentation. This is likely self-evident to those already working with GitHub/open source, but it would be useful to include something brief to guide new contributors.

Thanks for catching this. I've added a "Community guidelines" section to the top-level README (see bids-standard/pybids#418).

2.) Some of the references are missing a DOI in paper.bib and the rendered PDF.

This is fixed now; see rebuilt PDF below.

3.) The software is tested using automated CI across platforms and python versions, but there are currently a handful of test failures on the master branch (Appveyor is listed as failing on the main project page)

We currently don't officially support Windows, so this isn't technically an issue for us (though I realize it doesn't look great). We haven't removed the Appveyor tests in the hopes that someone might spontaneously come along and fix this for us (it will likely require a fairly thorough dive through the codebase), but I've added a note to the README indicating the scope of official support.

There was an example using the path_patterns argument to the build_path method of a BIDSLayout. However, the API docs do not seem to explicitly describe the expected syntax for this argument.

This was documented elsewhere (in the writing module), but I've copied the relevant part into the build_path docstring.
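For readers unfamiliar with the feature, the gist of such path patterns can be sketched in plain Python. This is a simplified stand-in, not pybids' actual implementation; the function name and the exact handling of `{entity}` placeholders and `[optional]` segments here are illustrative assumptions:

```python
import re

def fill_path_pattern(pattern, entities):
    """Illustrative sketch of a BIDS-style path pattern: {name}
    placeholders are substituted from the entities dict, and
    [bracketed] segments are dropped if any placeholder inside
    them goes unfilled. Not pybids' real implementation."""
    def substitute(segment):
        out = segment
        for key, value in entities.items():
            out = out.replace("{%s}" % key, str(value))
        return out

    result = []
    # Split while keeping the [optional] segments as separate parts.
    for part in re.split(r"(\[[^\]]*\])", pattern):
        if part.startswith("["):
            filled = substitute(part[1:-1])
            if "{" not in filled:  # all placeholders resolved
                result.append(filled)
        else:
            result.append(substitute(part))
    return "".join(result)

path = fill_path_pattern(
    "sub-{subject}[/ses-{session}]/sub-{subject}_task-{task}_bold.nii.gz",
    {"subject": "01", "task": "rest"},
)
print(path)  # sub-01/sub-01_task-rest_bold.nii.gz
```

Here the optional `/ses-{session}` segment is dropped because no session entity was supplied, which is the behaviour a new user would want spelled out in the build_path docstring.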

2.) In the documentation section on generating reports, it is mentioned that the reports are returned as a collections.Counter, but it isn't mentioned why this would be useful format or much detail on how it should typically be used.

@tsalo, could you comment on this and/or make any required changes? Thanks!

@whedon generate pdf from branch joss-review

Attempting PDF compilation from custom branch joss-review. Reticulating splines etc...

@grlee77 I think I've now addressed all of the comments in this thread and on the pybids repo, pending one minor issue that I've asked @tsalo (who authored the code in question) to look at. The joss-review branch collects all changes. Please let me know if there's anything else. Thanks for your effort and excellent feedback!

BIDSReport returns a Counter because missing data or variable acquisition parameters across participants can lead to different reports. With a Counter, users can access the most common report, which is likely to be the most accurate in cases with missing data, or they can access everything if different participants had different protocols. I can add this rationale to the documentation somewhere or can make BIDSReport only return the most common report, if you'd like.
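That rationale can be made concrete with a minimal stdlib sketch of the [0][0] indexing asked about above; the report strings and counts here are invented placeholders, not real pybids output:

```python
from collections import Counter

# Hypothetical per-subject methods-report strings, standing in for
# what a report generator might produce across a dataset.
reports = Counter({
    "3 T scanner; TR=2.0 s; 200 volumes": 28,  # most subjects
    "3 T scanner; TR=2.0 s; 180 volumes": 2,   # two truncated runs
})

# most_common() returns (report, count) pairs sorted by count, so the
# first [0] picks the top pair and the second [0] picks the report text.
main_report, n_subjects = reports.most_common()[0]
print(n_subjects)  # 28
```

The remaining elements of most_common() matter exactly when protocols differ across participants and the minority reports need to be inspected too.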

I think adding that rationale to the docstring would work. Feel free to add to the joss-review branch (I added you as a collaborator).

@grlee77 all authors did not actively confirm their desire to be included; instead I gave people the option of opting out (and none did). Does that work?

That seems fine to me, but let's check with @cMadan if that is typically okay for JOSS?

As another data point, there was a metadata gathering thread (https://github.com/bids-standard/pybids/pull/308) for Zenodo. There were a couple people who never responded, but none who requested not to be authors. I recognize Zenodo isn't JOSS, but if you're looking for active consent to be considered authors on the software (if not the paper), that thread is a worthwhile one.

Thanks for addressing the issues @tyarkoni. I reviewed the changes in the joss-review branch and they look good to me. I just checked off the remaining items in the review checklist.

We currently don't officially support Windows, so this isn't technically an issue for us (though I realize it doesn't look great).

I suppose you could remove the Appveyor badge from the readme until support is officially added, but not a big deal either way

@grlee77 all authors did not actively confirm their desire to be included; instead I gave people the option of opting out (and none did). Does that work?

Most of the other JOSS submissions I've reviewed had shorter author lists and everyone was aware of their inclusion.

@openjournals/joss-eics, is there any precedent for having JOSS co-authorship be opt-out as described here?

@openjournals/joss-eics, is there any precedent for having JOSS co-authorship be opt-out as described here?

Not that I know of. I'd be curious to hear how you notified them @tyarkoni?

I opened a PR in the repo soliciting feedback, plus we had the earlier discussion related to the Zenodo DOI @effigies linked to above. I could have sworn I explicitly mentioned authorship, but reading the various issues and PRs over, it appears not. Sorry about that!

Happy to raise the issue more explicitly, and will tag everyone by name this time. I still feel pretty strongly that making it opt-in will result in several people who are happy to be authors and have made meaningful contributions being left off, but if you prefer opt-in to opt-out, I'm fine with that. We could also trial the former and then make a decision based on feedback.

[Edit: FWIW, this seems like an issue that will arise again in future; e.g., I have other repos with a lot of contributors that I plan to submit to JOSS at some point. So it might make sense to address this explicitly in the guidelines one way or the other.]

Okay, see the request here. I haven't indicated exactly what we'll take a non-response to mean, in the hopes of pushing people to respond, but let's see what happens.

@tyarkoni, how have things gone with getting people to confirm co-authorship? (E.g., how many have confirmed, how many have not responded)

Thanks for the reminder! I think ~3 people haven't responded (am on phone and can't easily look). I guess despite my earlier reservations, it may be best to just go ahead without those folks. Maybe I'll try to track them down via email if I can find one, but assuming I don't hear back in the next day or two, should we stick to the standard opt-in approach?

@tyarkoni, yes, that sounds like even more reason to stick with the standard opt-in approach. It would be good to try and email those last few people in case they're not so active on GitHub. Let me know how it goes!

@whedon generate pdf from branch joss-review

Attempting PDF compilation from custom branch joss-review. Reticulating splines etc...

@whedon generate pdf from branch joss-review

Attempting PDF compilation from custom branch joss-review. Reticulating splines etc...

@cMadan apologies for the delay: the author list is now finalized. I think we're good to go. We have a fairly major PyBIDS release coming up (0.9), so we could in theory wait on that, but I don't know that it matters much (it won't change the text in any way).

Thanks for getting back to this, Tal. In that case I think it would be better to make 0.9 the version associated with the JOSS paper.

I also looked over the paper and it looks like the bib needs a bit of editing:

  • Fix Gorgolewski's initials across the entries so they are formatted the same. Currently this is not the case, which is why the PDF prints the first initials in the citations.
  • Spell out full journal names in all references and capitalise appropriately.
  • Capitalise "MRI" in article titles.
  • Since we will have the Zenodo archive DOI on the front page of the paper, as part of the metadata shown in the margin note, we don't include a citation to it or mention it in the paper. Can you delete that reference?

Also:

  • Add a space after the closing parenthesis in "fitlins)--", otherwise the URL gets broken.

@tyarkoni, can you give an update here?

Sorry, I've been traveling. Will get to this ASAP, probably post-OHBM.

Hi @tyarkoni, have you had a chance to work on this? If not, can you give us an anticipated date? More time is fine, we can set an automated reminder, or pause the submission if it will take a very long time.

@kyleniemeyer @cMadan I went ahead and addressed these concerns: https://github.com/bids-standard/pybids/pull/452

If possible, it would be nice to try to render before merging to master, to avoid too many PRs. I don't know if there are docs on how to render locally. The main thing I'm not sure of is whether using the bracket ({}) syntax will resolve the issues with KJG's name in citations.

@effigies you might check out the compile script another JOSS author put together, which I've used successfully to locally build a paper fairly easily: https://github.com/connorourke/crystal_torture/blob/master/paper/compile

@effigies, it's possible to generate the pdf from a branch @whedon generate pdf from branch custom-branch-name, but it looks like your current PR is on a branch of a fork, which is not supported.

@whedon generate pdf from branch joss-review

Attempting PDF compilation from custom branch joss-review. Reticulating splines etc...

@cMadan Thanks for the tip. Made one more change, so I'll re-render and I think we'll be ready for a final review, pending @tyarkoni approval of the changes I've made.

Edit: Apparently I need to make whedon commands the only thing in a post.

@whedon generate pdf from branch joss-review

Attempting PDF compilation from custom branch joss-review. Reticulating splines etc...

@tyarkoni, is this now ready for a final review?

@cMadan I'm pretty sure he's still on vacation.

Sorry, yes, ready for (hopefully) final review!

@cMadan Just a bump on this, in case you didn't see the last message. NBD if you've got other life going on.

@tyarkoni @effigies, everything looks good to me!

To move forward with accepting your submission, there are a few last things to take care of:

  • [ ] Make a tagged release of your software, and list the version tag of the archived version here.
  • [ ] Archive the reviewed software in Zenodo
  • [ ] Check the Zenodo deposit has the correct metadata, this includes the title (should match the paper title) and author list (make sure the list is correct and people who only made a small fix are not on it); you may also add the authors' ORCID.
  • [ ] List the Zenodo DOI of the archived version here.

You may find this helpful: https://guides.github.com/activities/citable-code/

Hi @cMadan, I'm not sure I understand. We have a release 0.9.2 and a Zenodo archive (https://doi.org/10.5281/zenodo.3333492). I think the author list is slightly different because not all contributors could be contacted to give explicit consent to be on the JOSS paper. And the title is not the same.

I guess the idea is to create a separate release? Is there a naming convention to follow, e.g. 0.9.2-joss? And I'm not sure what the threshold for too small of a contribution might be, but once identified, if we remove them from the Zenodo, presumably we need to remove them from the paper as well?

@effigies, I think those discrepancies are fine then.

Any thoughts, @openjournals/joss-eics ?

JOSS requests that you make a new release of the software after applying any revisions that occurred during the review. That new release should be deposited in Zenodo, for archival of the reviewed software.

Our expectation is that the Zenodo archive corresponding to the accepted JOSS paper have the same title and author list as the JOSS paper.

If I understand correctly, you're saying that some of the contributors to the software are included in the author list of the Zenodo archive, but are not listed on the JOSS paper because you couldn't contact them to agree.

What is the difference between listing someone as author on a Zenodo archive and a JOSS paper? Is there any? Why can they be listed without their approval in the Zenodo archive, but not in the JOSS paper? I don't have answers to this. We do leave it to the authors to decide on their authorship. But it is a bit strange to have a different author list on an archive of a software and a paper about that same software.

JOSS requests that you make a new release of the software after applying any revisions that occurred during the review. That new release should be deposited in Zenodo, for archival of the reviewed software.

That's the case. All JOSS-related revisions are in the most recent release.

Our expectation is that the Zenodo archive corresponding to the accepted JOSS paper have the same title and author list as the JOSS paper.

We can probably update the title... @tyarkoni do you have ownership of the Zenodo entry?

What is the difference between listing someone as author on a Zenodo archive and a JOSS paper? Is there any? Why can they be listed without their approval in the Zenodo archive, but not in the JOSS paper?

We settled on opt-out for Zenodo in https://github.com/bids-standard/pybids/pull/308, although we did get a very high rate of affirmative response, with I think only 7 failures to respond. As to why we chose opt-out for Zenodo, the default behavior of Zenodo is to list all contributors in the git history (or possibly some other GitHub metadata...), so removing a contributor for failing to respond seemed punitive.

Following a request above, we sought an additional explicit opt-in for JOSS in https://github.com/bids-standard/pybids/pull/395, with 3 failures to respond. I personally find the discrepancy acceptable. Software authorship credits do not have the same reputational implications as paper authorship, and JOSS seems to fall enough on the paper side of things to justify an active opt-in, which matches your policies.


However, I leave it to you. We can make a second tag with only JOSS-consenting authors. If so, please let me know what a sensible tag would be.

@arfon β€” Are we OK with an author-list discrepancy between the JOSS paper and Zenodo archive, for reasons described above?

@arfon β€” Are we OK with an author-list discrepancy between the JOSS paper and Zenodo archive, for reasons described above?

Yes, that’s fine.

@effigies, thanks for the detailed response. I think the only thing remaining then is for @tyarkoni to update the title in Zenodo.

Turns out, @tyarkoni doesn't have the magic touch for Zenodo. It's @yarikoptic who started the archive (https://github.com/bids-standard/pybids/issues/299), so only he has the power.

Yarik, would you mind updating the title of https://zenodo.org/record/3333492 (specifically that one) to "PyBIDS: Python tools for BIDS datasets"?

  • how do I share the super powers?
  • updated title for that one and 0.9.3 -- clicked Save -- but do not see that reflected anywhere

I haven't been able to find a way to share super powers on my repositories. We should probably open an issue with Zenodo, because this is extremely inconvenient for projects that don't have a BDFL.

Edit: This was reported by a familiar face two years ago. https://github.com/zenodo/zenodo/issues/1325

Related to zenodo/zenodo#35, zenodo/zenodo#151, zenodo/zenodo#163

@yarikoptic I wonder if hitting "Publish" instead of "Save" would make a difference.

@cMadan I'm going to see if I can update the title through our metadata (bids-standard/pybids#470). In which case, we'll go with the 0.9.4 tag.

@yarikoptic I wonder if hitting "Publish" instead of "Save" would make a difference.

Sorry for the delay!!! I just did it and it published. But now it doesn't show the version in the title. I wonder if I should add it (PyBIDS: Python tools for BIDS datasets (0.9.2)) and also adjust subsequent release records?

@yarikoptic The version is listed on the page, so I'm not too worried.

@cMadan WDYT?

@yarikoptic @effigies, I don't think the version needs to be in the title since it's prominent elsewhere.

@whedon set 0.9.3 as version

OK. 0.9.3 is the version.

@whedon set 0.9.2 as version

OK. 0.9.2 is the version.

@whedon set 10.5281/zenodo.3333492 as archive

OK. 10.5281/zenodo.3333492 is the archive.

@openjournals/joss-eics, I think we're all set to accept here!

@whedon accept

Attempting dry run of processing paper acceptance...

Check final proof :point_right: https://github.com/openjournals/joss-papers/pull/896

If the paper PDF and Crossref deposit XML look good in https://github.com/openjournals/joss-papers/pull/896, then you can now move forward with accepting the submission by compiling again with the flag deposit=true e.g.
@whedon accept deposit=true

@whedon accept deposit=true

Doing it live! Attempting automated processing of paper acceptance...

Thanks to @grlee77 for reviewing and to @cMadan for editing

🐦🐦🐦 πŸ‘‰ Tweet for this paper πŸ‘ˆ 🐦🐦🐦

🚨🚨🚨 THIS IS NOT A DRILL, YOU HAVE JUST ACCEPTED A PAPER INTO JOSS! 🚨🚨🚨

Here's what you must now do:

  1. Check final PDF and Crossref metadata that was deposited :point_right: https://github.com/openjournals/joss-papers/pull/897
  2. Wait a couple of minutes to verify that the paper DOI resolves https://doi.org/10.21105/joss.01294
  3. If everything looks good, then close this review issue.
  4. Party like you just published a paper! πŸŽ‰πŸŒˆπŸ¦„πŸ’ƒπŸ‘»πŸ€˜

    Any issues? notify your editorial technical team...

:tada::tada::tada: Congratulations on your paper acceptance! :tada::tada::tada:

If you would like to include a link to your paper from your README use the following code snippets:

Markdown:
[![DOI](https://joss.theoj.org/papers/10.21105/joss.01294/status.svg)](https://doi.org/10.21105/joss.01294)

HTML:
<a style="border-width:0" href="https://doi.org/10.21105/joss.01294">
  <img src="https://joss.theoj.org/papers/10.21105/joss.01294/status.svg" alt="DOI badge" >
</a>

reStructuredText:
.. image:: https://joss.theoj.org/papers/10.21105/joss.01294/status.svg
   :target: https://doi.org/10.21105/joss.01294

This is how it will look in your documentation:

DOI

We need your help!

Journal of Open Source Software is a community-run journal and relies upon volunteer effort. If you'd like to support us please consider doing either one (or both) of the following:
